
From Crown to Code: Technological Evolution Since the Last UK Coronation

Over the weekend, one of the most significant events in the modern history of the UK took place. On May 6, 2023, King Charles III, the current British monarch, was officially crowned.

The previous coronation was held 70 years ago, back in 1953, when King Charles's mother, the late Queen Elizabeth II, was crowned. She had ascended to the throne in 1952 and reigned over the United Kingdom until her death in 2022. In the intervening 70 years, the world has undergone profound changes.

But what has changed in terms of technology?

Reading Time: 7 minutes


Illustration: Lenka Tomašević

In the 1950s, when the last coronation of a UK monarch took place, technology was vastly different from what we see today. Since then, groundbreaking advancements across many fields have significantly transformed the way we live, work, and communicate. Without a doubt, the world has changed beyond recognition. 

Though some of these changes were revolutionary at the time, many of them are taken for granted nowadays. Here are just a few of them.  

Integrated circuit 

The integrated circuit (IC) is one of the most revolutionary technological advancements in modern history. This technology is composed of tiny electronic components such as transistors, resistors, and capacitors embedded into a semiconductor material, typically silicon.  

The first patent for an integrated circuit was granted on April 25, 1961, after the IC was invented independently by Jack Kilby in 1958 and Robert Noyce in 1959. Since then, ICs have transformed the world of electronics by reducing costs, increasing efficiency, and shrinking device sizes. 


Source: Wafer World 

ICs have paved the way for the development of groundbreaking innovations such as desktop computers, smartphones, and Internet of Things (IoT) devices. They have enabled the creation of complex circuits that are smaller, faster, and more energy-efficient than ever before.  

As technology has advanced, ICs have continued to evolve into even more powerful forms such as microprocessors, microcontrollers, and systems-on-chip. These advanced ICs have pushed the boundaries of computing power and energy efficiency, allowing for more complex and sophisticated electronic devices. 

Personal computers and the internet 

The advent of the integrated circuit led to two further landmark developments: the personal computer (PC) and the internet.  

The Digital Age, as our modern era is frequently called, began with early commercial machines such as the renowned IBM 701, which still relied on vacuum tubes; transistor-based computers followed later in the 1950s. Though computers had existed before, they were not available to the public.  

The advancement of microprocessors, which boosted computational power, made mass production of personal computers feasible. Companies such as Apple and IBM then popularized them, putting computers within reach of ordinary users. 

The emergence of the internet in 1969 marked a major milestone in the history of technology. ARPANET, a U.S. Department of Defense project designed for academic and research purposes, opened the way for the development of the internet we know today.  

The standardization of TCP/IP protocols in the 1980s enabled the creation of the World Wide Web by Tim Berners-Lee in 1989 and the first web browser in 1990. 
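The client/server model that TCP/IP standardized is easy to see in miniature. Below is a sketch using Python's standard socket module: a loopback echo server and a client exchanging bytes over a TCP connection. The port number and message are arbitrary, chosen only for illustration.

```python
import socket
import threading

PORT = 50007          # arbitrary loopback port chosen for this sketch
ready = threading.Event()

def run_echo_server() -> None:
    """A tiny TCP server that echoes back whatever one client sends."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("127.0.0.1", PORT))
        srv.listen(1)
        ready.set()                       # signal that we are accepting
        conn, _ = srv.accept()
        with conn:
            conn.sendall(conn.recv(1024))  # echo the bytes straight back

threading.Thread(target=run_echo_server, daemon=True).start()
ready.wait(timeout=2)                     # don't connect before the server listens

with socket.create_connection(("127.0.0.1", PORT)) as client:
    client.sendall(b"hello, TCP")
    reply = client.recv(1024)

print(reply.decode())  # the server sent our message back unchanged
```

Every protocol layered on top, from HTTP to email, ultimately rides on connections like this one.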

The 1990s saw an unprecedented commercialization of the internet, leading to its rapid expansion and widespread adoption. The emergence of search engines, e-commerce platforms, and social media transformed industries, communication, and society as a whole.  

The internet became an integral part of our daily lives, providing us with access to vast amounts of information, enabling us to connect with people across the globe, and empowering us with tools to express ourselves and engage with the world around us. 

Mobile phones 

The development of mobile phones since 1953 is yet another important technological advancement. 

These portable communication devices, with the ability to make voice calls, send text messages, and access the internet through cellular networks, have come a long way. The first commercial handheld mobile phone, the Motorola DynaTAC 8000X, was released in 1983 and was designed mainly for voice communication.  

However, the 1990s saw a rapid increase in mobile phone adoption, with improvements in battery life, network coverage, and design. Prices fell as well: the DynaTAC 8000X's original $3,995 price tag is equivalent to roughly $11,700 today.  

The new millennium witnessed the emergence of so-called feature phones. These new models offered basic multimedia features such as simple games, cameras, and music players.  


Source: NBC News 

In 2007, Apple released the original iPhone. While not the first smartphone, it combined advanced computing abilities, a high-resolution touchscreen, and an app ecosystem in a way that revolutionized the mobile phone industry people knew then.  

Later on, the introduction of Android and iOS, the two dominant operating systems, led to the development of thousands of applications that provide users with functionalities ranging from navigation to mobile payments. 

Today, smartphones are everywhere. Advancements in mobile technology have transformed how we communicate, access information, and entertain ourselves. With the functionalities they offer, smartphones have become an essential part of modern life. If someone from the 1950s were to see a smartphone today, they would probably think it was some kind of sorcery or magic. 

Space travel and exploration 

“That’s one small step for man, one giant leap for mankind.” 

The expansion of humanity into space may well be among the most momentous technological advances since the early 1950s. Over the past seven decades, space exploration has been characterized by major accomplishments, scientific advances, and international cooperation. 

In 1957, with the launch of Sputnik 1, the first artificial satellite, the so-called Space Race between the Soviet Union and the United States officially started. Aboard Vostok 1, Yuri Gagarin became the first human in space in 1961. The U.S. Apollo 11 mission made history in 1969, when Neil Armstrong and Buzz Aldrin walked on the Moon. 

The following decades witnessed key achievements in space exploration, including reusable spacecraft under NASA's Space Shuttle program (1981–2011) and modular space stations such as Russia's Mir (1986–2001) and the International Space Station (ISS).  

Space exploration has greatly benefited from uncrewed missions as well. Robotic missions that have visited planets, moons, and other celestial bodies, such as the Voyager, Mars Rover, and New Horizons missions, have improved our understanding of the solar system. 

Private enterprises have recently entered the field of space exploration too, focusing on cost-cutting, reusable rocket technology, and space tourism. SpaceX’s Falcon Heavy and Crew Dragon spacecraft have been particularly influential, enabling cargo and crewed missions to the ISS. These developments have opened new possibilities for space exploration and could lead to further innovation in the future. 

The DNA structure discovery 

1953 was also the year when James Watson and Francis Crick discovered the structure of DNA. Their groundbreaking discovery, built on experimental data including Rosalind Franklin's X-ray diffraction work, transformed our knowledge of genetics, setting the ground for further developments in molecular biology and genetics. 

Their work on DNA led to the Human Genome Project (HGP), an international scientific research project whose goal was to map the entire sequence of human DNA, roughly 3 billion base pairs. 


Source: Medical News 

The HGP began in 1990 and was completed in 2003, coordinated by the National Institutes of Health (NIH) in the United States and the Wellcome Trust in the United Kingdom. 

Numerous scientific disciplines, including genomics, medicine, and biotechnology, have been significantly impacted by the HGP. It has sparked the creation of novel diagnostic methods, individualized medical strategies, and targeted treatments for a range of diseases. The project has also driven improvements in DNA sequencing techniques that have greatly reduced costs and made sequencing accessible for a wider range of applications. 

The ascent of AI 

Artificial Intelligence (AI) is one of the most far-reaching technological advancements of the 21st century. It may have seemed far-fetched back in 1953, but today, it is everywhere around us.  

The development of AI has come a long way since its early beginnings in the 1950s, when programs such as Arthur Samuel's checkers player pioneered machine learning. The 1970s and 1980s saw the rise of expert systems and knowledge-based systems, such as MYCIN and DENDRAL, as well as AI languages like LISP and Prolog. 
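Expert systems of that era encoded domain knowledge as if-then rules and chained them together to reach conclusions. Here is a minimal forward-chaining sketch of the idea; the rules and facts below are invented for illustration, nothing like MYCIN's actual knowledge base, which ran to hundreds of rules with certainty factors.

```python
# A toy forward-chaining rule engine in the spirit of 1970s expert systems.
# Each rule maps a set of premises to a single conclusion (all hypothetical).
rules = [
    ({"fever", "cough"}, "respiratory_infection"),
    ({"respiratory_infection", "chest_pain"}, "possible_pneumonia"),
]

def forward_chain(facts: set[str]) -> set[str]:
    """Repeatedly fire any rule whose premises are all known facts."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)   # derive a new fact and loop again
                changed = True
    return facts

inferred = forward_chain({"fever", "cough", "chest_pain"})
print(sorted(inferred))
```

Note how the second rule only fires after the first has derived its conclusion; that chaining of intermediate conclusions is what made these systems feel like reasoning.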

The 1990s and 2000s were marked by the advent of machine learning, leading to breakthroughs in speech recognition, computer vision, and natural language processing. However, the most significant breakthrough came in the form of deep learning in the 2010s.  

From 2020 onwards, AI research has advanced quickly and has been incorporated into many aspects of daily life. With powerful architectures such as convolutional neural networks (CNNs), recurrent neural networks (RNNs), and transformer deep learning models, AI applications have revolutionized the way we interact with technology. 

In recent years, AI has become a part of our daily lives, powering virtual assistants like Siri and Alexa, recommendation systems on social media platforms, and self-driving cars.  

However, as AI becomes more prevalent, ethical considerations and responsible use have become increasingly important. Though the integration of AI into various fields has the potential to revolutionize the way we live and work, it must be done with caution and care. 

The rise of social media 

Social media has undergone a seismic transformation since its early days. Platforms such as SixDegrees and MySpace kick-started the era of connecting users based on shared interests and connections in the late 1990s and early 2000s. 

However, it wasn’t until the mid-2000s, when Facebook and Twitter made their debut, that social networking truly took off. The new networks empowered users to share updates, photos, and messages with their networks. The advent of YouTube further transformed content creation and consumption, democratizing video-sharing like never before. 


Source: Euro-Med Human Rights Monitor 

But the real explosion of social media came in the 2010s, with the arrival of visually-driven platforms like Instagram, Snapchat, and TikTok. Short-form videos and attention-grabbing visuals became the norm. LinkedIn, on the other hand, emerged as the platform of choice for professional networking.  

Social media’s impact on society expanded rapidly, with businesses, marketing, and activism leveraging its immense reach to drive global movements and viral content. Yet with this growth, concerns about privacy, mental health, and the spread of misinformation have become ever more pressing. 

Blockchain and crypto 

Much like artificial intelligence, the notion of blockchain and cryptocurrencies was inconceivable back in the 1950s. No one could have imagined that it would someday be possible to pay with or transfer virtual money anywhere across the globe. 

In the past decade, blockchain and crypto have become the two most discussed innovations. Blockchain refers to a decentralized digital ledger that records transactions on a secure and transparent network. This way, it enables secure and tamper-proof record keeping. 

Owing to features such as decentralization and transparency, blockchain technology has the potential to revolutionize various industries, including but not limited to finance, banking, healthcare, and supply chain management. It provides a way to securely transfer and store data without the need for intermediaries, thus increasing efficiency, reducing costs, and improving transparency. 
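The "tamper-proof" property comes from chaining: each block's hash covers the previous block's hash, so editing any historical record invalidates every block after it. A minimal sketch in Python (a toy ledger, with none of the networking or consensus machinery of a real blockchain):

```python
import hashlib
import json

def block_hash(index: int, prev_hash: str, data: str) -> str:
    """Hash a block's contents together with the previous block's hash."""
    payload = json.dumps([index, prev_hash, data]).encode()
    return hashlib.sha256(payload).hexdigest()

def build_chain(records: list[str]) -> list[dict]:
    """Link each record to its predecessor via SHA-256."""
    chain, prev = [], "0" * 64          # genesis block points at an all-zero hash
    for i, data in enumerate(records):
        h = block_hash(i, prev, data)
        chain.append({"index": i, "prev_hash": prev, "data": data, "hash": h})
        prev = h
    return chain

def is_valid(chain: list[dict]) -> bool:
    """Recompute every hash; an edited block breaks all links after it."""
    prev = "0" * 64
    for blk in chain:
        if blk["prev_hash"] != prev or blk["hash"] != block_hash(blk["index"], prev, blk["data"]):
            return False
        prev = blk["hash"]
    return True

chain = build_chain(["Alice pays Bob 5", "Bob pays Carol 2"])
print(is_valid(chain))                    # the untouched chain verifies: True
chain[0]["data"] = "Alice pays Bob 500"   # tamper with history
print(is_valid(chain))                    # verification now fails: False
```

In a real blockchain, thousands of independent nodes hold copies of the chain, so a forger would have to rewrite history on a majority of them at once, which is what makes the ledger practically tamper-proof.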

Cryptocurrencies are digital or virtual currencies that use cryptographic techniques to secure transactions and control the creation of new units. They have gained popularity as an alternative to traditional currency, with some investors seeing them as a valuable asset class. 

The first and most popular cryptocurrency, Bitcoin (BTC), was created in 2009 by the pseudonymous Satoshi Nakamoto. Six years later, it gained its fiercest rival, Ethereum (ETH), which became the second most popular cryptocurrency. As of 2023, CoinMarketCap lists over 9,600 cryptocurrencies. 

The rise of blockchain technology and cryptocurrencies has also raised concerns about security, regulation, and their potential use in illicit activities. Nonetheless, their potential impact on the global economy is significant, and their development is an exciting area to follow. 
