NEW DELHI | January 7, 2024 | Sunday | Pages 12

e-volution

AIRAWAT, India's 21st-century supercomputer, sits among the world's top 100 even as PC sales hit 286 million worldwide last year. This ubiquitous machine is the dominant artefact of civilisation now, and it's going new places

By Rijul A Das

To think it all began with a room-sized machine. The Difference Engine, designed by Charles Babbage in the 1820s, is probably the world's first computer. It used a fixed mechanical framework to carry out only one mathematical function, much like an industrial-era abacus. Babbage is called the "father of the computer"; his Engine prepared numerical tables using the 'method of differences', a mathematical technique used in navigation and astronomy. It was a miracle of technology at that time. Now computers drive life and lifestyles. They run your car, your airplane and your microwave, a city's power grid and its traffic lights; they are used in surgeries and diagnosis; there is very little that is non-computerised left in our lives. They are used at work, for social media, gaming, entertainment and robotics. Portable laptops are everywhere, their young owners in cafes and cabs lost in nerdy concentration over Java and JavaScript. The 20th century was the Age of Data and the PC was its whisperer. Last year, India's supercomputer AIRAWAT made it to the list of the world's 100 most powerful computers. Cybersecurity analyst Vishesh Mahajan, who runs his own cybersecurity firm out of Singapore, says: "Projects like AIRAWAT will ensure that we revolutionise agriculture, education, and healthcare, using pattern recognition and prediction, using AI.
Moreover, given the population of India, we have more than enough data to train our own models and develop strategically unique solutions to problems unique to us." Back in the 19th century, Babbage didn't know a byte about PCs or software. He couldn't predict that one day a young genius named Elon Musk would become a billionaire after his father took him to a hotel where he played computer games. Musk did; he taught himself to code on a personal computer (PC) in his father's house and sold his first video game for $500. Now PCs cost less, and are lighter, sleeker and faster. Babbage could never guess either that a geeky owl-spectacled boy called Bill Gates and his partner Paul Allen would see the cover of Popular Electronics and start Microsoft after selling a BASIC language interpreter for a PC called the Altair. Gates and Allen didn't even have an Altair; they used a simulator.

The tale of modern computers actually begins with the likes of the Electronic Numerical Integrator and Computer (ENIAC) and Konrad Zuse's Z3, two of the world's first digital computers. These hulking monsters ran on vacuum tubes, devoured enough electricity to power a house for a week, and churned out solutions to some of the most complicated arithmetic problems far faster than any human. The bigger deal, however, was that these computers could be reprogrammed to carry out diverse kinds of calculations. The world's first digital computers were a marvel of their time, and yet they were dwarfed by the invention of the personal computer, and then the laptop. Weren't we all chuffed to dial up to the Internet on those 386s and 486s? By the time those boxes gave way to folding personal computers that could be carried in a bag, the Internet had become faster too. Now there are Wi-Fi and supersleek laptops with touch screens. These pioneers paved the way for a steady march towards smaller, faster machines, each generation marking a leap in power and accessibility.

Down the Ages

Abacus (c. 2700 BC): Used by kids to learn basic mathematical skills, this earliest known computing device, consisting of beads on rods, originates in ancient Mesopotamia.

First Generation: Vacuum Tubes (1940-1956): The first programmable computer, the Z1, dates back to 1936, created by Konrad Zuse in Berlin. First-gen computers evolve in the 1940s and make use of vacuum tubes and punched cards for data processing.

Second Generation: Transistors (1956-1963): 1947 sees the invention of the transistor at Bell Labs. Second-gen computers still rely on punched cards, but use a symbolic ('assembly') language. The Programma 101, a 65-pound machine the size of a typewriter, is born, with 37 keys and a built-in printer.

Today, machines from giants like Dell, Lenovo and Apple can fit into our backpacks, while smartphones that would put the ENIACs and Z3s to shame with their computing prowess nestle comfortably in our pockets. It's a tale of constant miniaturisation and ever-expanding capabilities, now beginning to rapidly telescope into a whole new world of extremely personal computers that harness the latest innovations and technologies.

Wear We're Headed

Wearable computing is the new big thing. Smartwatches such as the Apple Watch, Samsung's Galaxy Watch, and several other offerings from the likes of Garmin have shown the way. These wrist-strapped devices carry out functions that only the most sophisticated laptops could carry out just a decade ago. Put another way, the smartwatch of today uses more computing power than NASA used in designing, creating and launching its Apollo missions to the moon. It has started to get fantastic now. We are moving into an age where all the computing power you could possibly need comes from a matchbox-size device that you wear on your lapel. We are, of course, talking about the Humane AI Pin, a 'smartphone' that projects a display on your hand, or a wall, and lets you interact with your 'phone' using just your voice.
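An aside for the technically curious: the 'method of differences' that Babbage's Engine mechanised is simple enough to demonstrate in a few lines of modern code. The sketch below is illustrative (the polynomial and its starting values are this writer's example, not drawn from the Engine's own tables): for a polynomial of degree n, the n-th finite differences are constant, so an entire table of values can be cranked out by repeated addition alone, which is exactly what Babbage's gear wheels did.

```python
# A sketch of the 'method of differences': tabulating a polynomial
# using only addition, the way the Difference Engine's wheels did.
# The polynomial chosen here is illustrative, not from the article.

def tabulate(initial_diffs, count):
    """Given [p(0), first difference, second difference, ...],
    produce `count` successive values of p by repeated addition."""
    diffs = list(initial_diffs)
    values = []
    for _ in range(count):
        values.append(diffs[0])
        # Roll each difference forward by adding the next-order one,
        # mimicking one turn of the Engine's crank.
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return values

# For p(x) = 2x^2 + 3x + 1: p(0) = 1, first difference p(1) - p(0) = 5,
# and the second difference is the constant 4.
print(tabulate([1, 5, 4], 6))  # → [1, 6, 15, 28, 45, 66]
```

No multiplication appears anywhere: once the starting differences are set, every table entry falls out of additions alone, which is why the method suited a purely mechanical machine.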
How about a pair of sunglasses with a full-fledged computer inside, along with a camera, speakers, a touchpad and a battery to power all of this for hours? Well, Meta and Ray-Ban have made a pair of smart glasses that does just that.

Back to Biology

Computing hardware is reaching its silicon limits in the relentless march of miniaturisation. But what if the answer to our processing-power woes isn't smaller chips, but living ones? Think biocomputing. Imagine neurons, the building blocks of the human nervous system and nature's own logic gates, firing away inside a dish, calculating the mass of a dying star or figuring out the next move in a murderous game of chess. That's what biocomputing is. Biomechanical scientists and engineers are harnessing the inherent computational power of living cells, especially nerve or brain cells, to perform tasks beyond the reach of even the most advanced silicon chips. Research centres like Final Spark in Switzerland, Cortical Labs in Australia and Koniku in the US are all vying to make the next superpowerful processing chip, something like NVIDIA's AI-processing GPUs, using brain cells and other animal cells.

Dr Fred Jordan, the CEO and co-founder of Final Spark, is focused on developing the first AI-processing chip using biological cells. "We're at the beginning of a revolution. The way the brain processes information is incredibly intricate, and today's digital computers simply aren't up to the task. So, we thought, since hardware alone isn't sufficient, let's revolutionise it with living neurons or 'wetware'." The advantages of using animal cells are tantalising. Biological systems are inherently fault-tolerant, self-repairing, and operate on significantly lower power than traditional silicon-based electronics. Biocomputers can also tackle problems like protein folding or drug discovery with (Turn to page 2)

"We imagine several scenarios where Mojo Vision can be useful. People with low vision could use a supplementary high-resolution camera integrated into glasses or suspended near their ears."
Steve Sinclair, Mojo Vision's SVP of Product, on the Mojo Vision Contact Lens' viability as a medically approved assistive device

Third Generation: Integrated Circuits (1964-1971): Using a semiconductor material that contains thousands of transistors, the computer becomes more reliable and fast, smaller and more affordable. Punched cards are replaced by the mouse and keyboard. The Xerox Alto is born, able to print documents and send emails.

Fourth Generation: Microprocessors (1972-2010): The world's first commercially available microprocessor makes its entry. The first Macintosh computer is introduced in 1984, followed by the iMac G3 in 1998, which includes USB ports. It's an era of change. Tim Berners-Lee invents the World Wide Web in 1991. Mobile computing follows in the 2000s, with the advent of handheld devices.

Fifth Generation: Artificial Intelligence (2010 onwards): ChatGPT becomes a game-changer. Google claims to have achieved "quantum supremacy" in 2019. The landscape is still evolving.
Express Network Private Limited publishes thirty-three e-paper editions of The New Indian Express newspaper, thirty-two e-paper editions of Dinamani, one e-paper edition of The Morning Standard, one e-paper edition of Malayalam Vaarika magazine and one e-paper edition of Indulge - The Morning Standard, Kolkata.