Kimberly Powell has been a first-hand witness to Nvidia’s spectacular transformation. When she was hired, sixteen years ago, the company was realizing that GPUs, the chips dedicated to rendering video game graphics that it invented in 1999, were as well suited to scientific computing as to bringing pixels to life. Over the following decade, Nvidia’s computing power became highly coveted around the world. Demand exploded around 2022 with the advent of generative AI, programs capable of creating text, images or sound on their own. The company founded by Taiwanese-American Jensen Huang has taken full advantage, establishing itself today as the largest market capitalization in the world, ahead of Apple and Microsoft, at more than 3,000 billion dollars, with its share price gaining 200% over the last year alone.
Powell, now global vice president of healthcare, discusses the concrete impact that GPUs, and AI more broadly, will have on detecting diseases and treating patients. In her view, healthcare may one day prove the most decisive sector for Nvidia, given the needs involved and the quantity of data available to feed its machines.
Why does medicine need computing power today?
Kimberly Powell: Radiology and medical imaging are good examples. X-rays, CT scans, MRIs and ultrasounds, non-invasive diagnostic technologies, have in some cases existed for more than fifty years. For a quarter of a century now, computers have made it possible to digitize the images obtained, and growing computing power has allowed us to make ever better use of them. One of my first projects at Nvidia, through a partnership with Siemens, was to observe a baby in utero in 3D for the first time.
Today, the data collected by the many sensors in these machines allows us to reconstruct images of our anatomy. Other areas of health that really need computing include genomics, molecular dynamics and drug discovery. It was only when accelerated computing arrived, with the power provided by GPUs, that scientists were able to run simulations that were both faster and covered a multitude of parameters. That is when we began to understand how molecules interact with one another and to make new progress. Finally, medicine needs computing power for AI itself.
What do AI and, more recently, generative AI actually bring to the field of health?
Take the example of radiology. What does it fundamentally consist of? The idea is for doctors to spot abnormalities or trends that help them understand what is happening in the patient and then treat them. AI helps by checking the quality of the image obtained and by flagging elements the doctor can examine in more depth. Over the last ten years, we have also seen the creation of numerous specialized models: one detects brain tumors on an MRI, for example, while another measures, in cardiology, the ejection fraction, the quantity of blood pumped out of a heart chamber with each beat. Very specific tasks. Since the appearance of “transformer” neural networks in 2017, which gave rise to generative AI, it has become possible to create models that are not task-specific, trained on large quantities of diverse medical imagery, a bit like the LLMs used in ChatGPT, but for health. This opens up a richer understanding of images, beyond detection: recognizing, for example, the characteristics of a tumor in order to treat it better.
“By 2030, healthcare will be the world’s largest data industry”
The next step is for every academic institution, start-up and pharmaceutical company to develop its own generative AI models from its own data. Then, finally, to bring AI into the operating room, via platforms we offer such as Holoscan and robots like that of our partner Moon Surgical, a start-up based in Paris. That means giving surgeons access to AI algorithms in real time during an operation. All of this makes me think that the greatest contribution of generative AI will be in medicine.
Why?
You know, many nations face similar problems today: aging populations, and therefore difficulties in managing chronic care, along with overworked health professionals. So we need a level of automation: both to augment doctors’ skills and free up their time, and to give drug researchers 10, 100, why not a million times more chances of making their next discovery. AI addresses many essential questions: how can we help patients better understand their health status? How can we be better at prevention thanks to all the data we have? How do we limit medical errors? How can we help 2 million more surgeons? Because, it must be remembered, only a third of the world has access to surgery, or to services such as radiology and medical imaging, to date.
What will be the impact of AI and generative AI on health in ten years?
By then, and even before that, by 2030, healthcare will be the largest data industry in the world, because we are digitizing everything related to healthcare, starting with imaging. We are now sequencing large populations, and more than once. We run life-science laboratory equipment twenty-four hours a day so we can observe how cells respond to drugs. Just as the entire Internet powers a ChatGPT, there will be programs that bring the entirety of scientific experimentation into one computer. And it will be global, because all of these tools are themselves being transformed in the era of AI, including smartphones, which embed the technology directly; that is where the information will be accessed. Furthermore, many programs today are open source and easily accessible, which means we will be able to conduct research faster than ever. No one can claim we have a perfect mastery of biology today, because the way we have always learned it is either through evolution or through experimentation, and those two approaches alone are not enough to understand living things.
Cancer is the leading cause of premature death in France, and one of the main causes worldwide. Can we expect major advances in the fight against cancer with AI soon?
I don’t see how that could not be the case. One of the great complexities of cancer is that it is unique every time, so each therapy is personalized. Naturally, no single doctor can hold all of this information. Simply being able to collect the experience of every cancer patient, their journey, their treatment, and to process that information in a computer should greatly help all care.
Many people hope to have a doctor in their pocket. Do you think this will be possible one day?
I think it has already happened to some extent. First of all, doctors have doctors in their pockets, right? They consult their colleagues every day and get second opinions from people who know. But, of course, ChatGPT’s capabilities make it possible, in a way, to integrate all the knowledge in the world. So instead of having one or two opinions, with ChatGPT a doctor could have over a million medical experiences in their pocket. And the same goes for the patient. That said, an LLM – the kind of program behind ChatGPT – is an excellent planner, but it needs other technologies, such as RAG (retrieval-augmented generation), to tap into reliable knowledge: government documents, hospital policies, your personal electronic health record and its up-to-date information. These systems are not yet fully developed.
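To make the retrieval step Powell describes a little more concrete, here is a minimal sketch of the RAG pattern in Python: documents from a trusted local store are ranked against the question and folded into the prompt before any model is called. The document names, the keyword-overlap scoring and the prompt format are illustrative assumptions, not Nvidia’s or any specific vendor’s implementation.

```python
# Minimal RAG sketch: ground the prompt in trusted local documents
# (e.g. hospital policies, a patient record) before calling an LLM.
from collections import Counter

# Hypothetical trusted sources kept on-premises.
DOCUMENTS = {
    "hospital_policy.txt": "Post-operative patients are monitored for 24 hours before discharge.",
    "patient_record.txt": "Patient is allergic to penicillin; last measured ejection fraction 55 percent.",
    "imaging_guideline.txt": "First-line imaging for suspected stroke is a non-contrast CT scan of the head.",
}

def retrieve(question: str, k: int = 2) -> list[str]:
    """Rank documents by naive word overlap with the question
    (a stand-in for vector search over embeddings in a real system)."""
    q_words = Counter(question.lower().split())
    def score(text: str) -> int:
        return sum(1 for w in text.lower().split() if w in q_words)
    return sorted(DOCUMENTS, key=lambda name: score(DOCUMENTS[name]), reverse=True)[:k]

def build_prompt(question: str) -> str:
    """Assemble the grounded prompt that would be sent to the LLM."""
    context = "\n".join(f"[{name}] {DOCUMENTS[name]}" for name in retrieve(question))
    return f"Answer using only the context below.\n\nContext:\n{context}\n\nQuestion: {question}"

if __name__ == "__main__":
    print(build_prompt("Which imaging is recommended for a suspected stroke?"))
    # A real system would now send this grounded prompt to an LLM,
    # hosted locally or in the cloud, and return its answer.
```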
Health data is among the most coveted by hackers, and many people are wary of handing over data as personal as health records to AI systems and to clouds outside their borders. How do you address these fears?
In truth, I unfortunately think that various services have been collecting data on us for about two decades and ultimately already know more about us than we know about ourselves. But it is obviously a crucial question, at every level: for patients, and for health systems more broadly. Several privacy-preserving techniques exist; I am thinking of federated learning, which is growing rapidly. It makes it possible to train more intelligent and more personalized AI models while protecting users’ privacy, because their data stays on their devices. In the spring, Nvidia also launched NIM. The idea is to encapsulate this computing capability in small software containers that can be brought wherever it is needed. Instead of the service being available only from a cloud provider, the AI, a model like Meta’s Llama 3 for example, can run locally on a computer, in a hospital data center, or inside a surgical robot. Just about everywhere.
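As a rough illustration of the federated learning Powell mentions, here is a toy sketch of federated averaging in Python: each hospital runs a few training steps on its own synthetic data, and only the resulting model weights leave the site to be averaged by a central server. The linear model, the data and the hyperparameters are invented for illustration; this is not any particular production framework.

```python
# Toy federated averaging: raw data stays at each hospital,
# only model weights are shared and averaged.
import numpy as np

rng = np.random.default_rng(0)

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One hospital's local training: a few gradient steps on a linear model."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)   # least-squares gradient
        w -= lr * grad
    return w                                 # only the weights leave the site

# Three "hospitals", each with private data that never leaves its premises.
hospitals = [(rng.normal(size=(50, 4)), rng.normal(size=50)) for _ in range(3)]

global_w = np.zeros(4)
for _ in range(10):                          # federated rounds
    # Each site trains locally, starting from the current global model.
    local_weights = [local_update(global_w, X, y) for X, y in hospitals]
    # The central server aggregates only the weights (federated averaging).
    global_w = np.mean(local_weights, axis=0)

print("Global model after 10 federated rounds:", global_w)
```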