The Joy of Why - Can Thermodynamics Go Quantum?

Episode Date: September 12, 2024

The principles of thermodynamics are cornerstones of our understanding of physics. But they were discovered in the era of steam-driven technology, long before anyone dreamed of quantum mechanics. In this episode, theoretical physicist Nicole Yunger Halpern talks to host Steven Strogatz about how physicists today are reinterpreting concepts such as work, energy and information for a quantum world.

Transcript
Starting point is 00:00:00 In the mid-1800s, engineers were grappling with questions at the forefront of the Industrial Revolution. How to convert steam into mechanical work, translate rushing streams into electrical energy, or pump water out of mines. Their inquiries and observations laid the groundwork for a new science, thermodynamics. By the early 1900s, we had not one, but three laws of thermodynamics. These laws have since become ubiquitous and proven fundamental to our understanding of physics in everyday life.
Starting point is 00:00:43 But as our knowledge of the physical world continues to grow, the limits of these old mechanical notions become more apparent, especially as we approach the quantum scale. I'm Steve Strogatz, and this is The Joy of Why, a podcast from Quanta Magazine where I take turns at the mic with my co-host, Janna Levin, exploring the biggest unanswered questions in math and
Starting point is 00:01:05 science today. In this episode, we're going to ask, what do concepts like work and heat mean on an atomic or even subatomic level? And can our laws of thermodynamics be reinterpreted in quantum terms? We're joined by Nicole Younger Halpern. She is a theoretical physicist at the National Institute of Standards and Technology and an adjunct assistant professor at the University of Maryland. Her research lies at the intersection of quantum physics, information processing, and thermodynamics. She's also the author of an award-winning book,
Starting point is 00:01:44 Quantum Steampunk, The Physics of Yesterday's Tomorrow. Nicole, it's a great pleasure to have you with us on The Joy of Why. It's a delight to be here. Thanks for having me. Well, thank you for joining us. I really am excited about this. I was just asking my wife on the drive over to the studio about the word steampunk, I have to admit that I'm not familiar with this. She mentioned to me that even jewelry can be done in steampunk style. LESLIE KENDRICK Steampunk comes up in costumes, in conventions,
Starting point is 00:02:16 in jewelry, in film, in books and short stories, all over the place. It combines the aesthetic of the 1800s, so the Victorian era, people in waistcoats and petticoats and top hats, and also the American Wild Wild West and Beijing to Japan, as well as futuristic technologies. Uh-huh. Excellent. Well, it's a very intriguing title, Quantum Steampunk. I hope our listeners will check out the book. So let's start talking about the Victorian era and the thermodynamics. Ideas that grew out of that, we now think of it as classical thermodynamics. And I mentioned the laws of thermodynamics.
Starting point is 00:02:55 There are so many things to unpack. The three laws, ideas like work, heat, energy, entropy, efficiency. Take us through some of those ideas at least to start with. Thermodynamics is something that we all have a sense of, but maybe we kind of take it for granted. And maybe that's because thermodynamics is so general. It's the study of energy, period. The forms that energy can be in and the transformations amongst those forms.
Starting point is 00:03:21 Energy can be transmitted in the form of heat and in the form of work. Work is coordinated, directed energy that can be directly harnessed to do something useful, like power a factory or charge a battery. Heat is random, uncoordinated energy. It's the energy of particles jiggling about randomly. And heat engines turn this random heat into coordinated useful work. And heat and work feature in the laws of thermodynamics. The number has been growing a bit, depending on whom you ask. There might be three, there might be four, there might be five. The zeroth law of thermodynamics was actually developed after the first three,
Starting point is 00:04:04 but people thought it was so important that it should be given precedence. And so it tells us that there are thermometers. I suppose that you have a cup of tea and I have a cup of tea. We want to be able to compare their temperatures. How can we do that? We can do it using a thermometer. The first law tells us that the total amount of energy in the world remains constant. The second law tells us that the entropy of a closed isolated system remains
Starting point is 00:04:34 constant or increases only, at least on average. And the third law tells us that you can't actually cool any system down to the lowest conceivable temperature absolute zero, zero Kelvin in any finite number of steps. So that is a very brief history of thermodynamics. Excellent. Great. That is a good summary. The second law always feels to me like the really deep one. Yes.
Starting point is 00:05:02 Right. I mean, the concept of entropy is often phrased as some measure of disorder in a system. Do you want to talk to us about entropy just for a minute? Sure. I think of entropy as a measure of uncertainty. And all sorts of things can have entropies associated with them. For instance, the weather on any given day in the Boston area is extremely random.
Starting point is 00:05:24 It could be sunny or rainy or cloudy or snowy. And suppose that we learn on some given day what the weather is. So we've learned some amount of information. The amount of information we learn can be seen as an entropic quantity. And then suppose that we average this amount of information that we learn over all of the days. That's another entropic quantity, a pretty common one. And we can translate the story into thermodynamics by saying, we have physicists favorite thermodynamic system, a classical gas in a box. And
Starting point is 00:05:59 suppose that we know only large scale properties of a gas like the total number of particles and the volume. There are lots of different microstates or configurations that are consistent with this large-scale macro state. By a microstate, I mean a list of all of the particles' positions, all of their momentum, and maybe some other properties, depending on what sorts of particles we have. So if we know just these large scale properties, how ignorant are we about the microstates?
Starting point is 00:06:34 And that's essentially the thermodynamic entropy. It's amazing this idea of the gas in a box, because it's true, that is the universal example. And I'm actually at this very moment, sitting in a box because it's true that is the universal example and I'm actually at this very moment sitting in a box called a studio the door is closed there is gas in here it's the air around me I'm very glad I mean as far as I can tell now I suppose it's conceivable that all the air molecules could spontaneously go into the corner and you might hear me gasping, but that would be a
Starting point is 00:07:05 very rare event. Would that be a state if all the molecules were in the corner, that would be what, very low entropy I suppose? Right. So there's a lot of debate, especially in the philosophical community about how to define thermodynamic entropy, but the way that we're often taught to reason in statistical physics classes, we would tend to say, yes, the state in which the particles are all clumped together in the corner of the box is indeed a low entropy state. There's a concept that comes up a lot, equilibrium. Can you remind us, what does that mean?
Starting point is 00:07:38 Like when we speak of a system being at thermodynamic equilibrium, what is that? Why does it matter? Equilibrium is a rather quiet state of a system. It's a state in which large scale properties like the total temperature and total volume remain approximately constant over time. And there's no net flow of anything like heat or particles into or out of the system.
Starting point is 00:08:03 So suppose that we have had a hot cup of tea, we have let it sit on the counter for a long time, it has come to have the same temperature as the rest of the room, and a little bit of the water has evaporated away. At this point, the tea is at thermal equilibrium with its environment. And so it always seems kind of like an artificial thing
Starting point is 00:08:24 that happens in a chemistry lab or in this famous tea cooling off on the kitchen counter. Whereas in real life, you know, I'm eating food all day long, it seems, just had a cookie before coming to the studio. I can't relate to thermodynamic equilibrium very well. Is that fair to say? In our everyday life, what things are at equilibrium and what things are not? A great deal in our lives, including life itself, as you point out, is far out of equilibrium. Organisms keep themselves far out of equilibrium by doing just what you said, by eating so that they consume energy in a well-organized form and expel it in a very highly entropic form.
Starting point is 00:09:07 So you radiate lots of heat. This helps keep us far out of equilibrium. If you have run a bath and let it sit around too long, you might've experienced equilibrium unpleasantly or made a cup of coffee and gotten distracted by your work so that you end up having to drink cold coffee, you might have experienced equilibrium. I see. So it does seem like a sort of final state. It's like after everything settles down,
Starting point is 00:09:34 there's no drive for anything to change anymore, it sounds like. Exactly, there is no drive. So when you mentioned the different laws of thermodynamics, what kinds of systems do the laws apply to? What other caveats do we need to make about those systems in order for the laws to apply? Well, the laws of thermodynamics were originally formulated by people who had in mind large classical systems. They didn't necessarily think of these systems as consisting of many, many particles. The theory of atomism
Starting point is 00:10:05 was not entirely accepted by the Victorian era, but they were thinking of systems that, at least now, we will all acknowledge, consist of lots and lots of particles. Around the turn of the 20th century, people discovered Brownian motion, which is random jiggling of particles that's observable with a microscope. And it led people to accept very broadly that, in fact, materials do consist of very small particles, they jiggle around randomly, and occasional jiggling in the wrong direction led to some minor changes in at least the second law of thermodynamics. But what's really surprising to me is that the laws of thermodynamics seem to be going strong, even though we've learned a great deal since even the turn of the 20th century
Starting point is 00:10:55 about small systems, biological systems, chemical systems, and even quantum systems. Well, so we've been talking so far from the point of view of these particles that you keep mentioning, the atoms or molecules, systems made up of enormous numbers, but now we're going to start to get into the quantum aspects of thermodynamics with you. Is the main novelty conceptually the idea that we have just very few particles now, or is it that we're using quantum ideas instead of classical ideas. I see quantum thermodynamics as involving the extension of conventional thermodynamics to small systems, quantum systems,
Starting point is 00:11:34 and far-from-equilibrium systems. Although not all of these categories have been addressed only by quantum thermodynamicists. For instance, there was a lot of really amazing work done in far from equilibrium statistical mechanics during the 20th century in the field of non-equilibrium statistical mechanics, which is adjacent to and admired by and friends with quantum thermodynamics. Interesting.
Starting point is 00:11:58 So, if I heard you right, you said there are going to be three kinds of things to think about, far from equilibrium, quantum, and small numbers. All three of those we could think of as at the edge of what was traditional thermodynamics. By traditional, I mean like the subject that Gibbs and Maxwell and people like that helped develop in the late 1800s. They didn't have the math or the physical concepts to really handle small systems, far from equilibrium systems, or they wouldn't have even known about quantum systems at that point.
Starting point is 00:12:32 That's a good way of putting it. I'm surprised by your answer. It's interesting. I thought you would just say quantum, but you'll deal with large numbers. But so you can allow for small systems too. One of the reasons is suppose that we want to address quantum thermodynamics. That is complicated to do. So it can be simpler to make a model that
Starting point is 00:12:55 is amenable to small systems, solve some problems using this model. And after you've solved those problems, add in some more quantum features like coherences, which sometimes I describe as the wave-like nature of quantum particles. And this is in fact what happened a number of years ago in the intersection of quantum thermodynamics and quantum information theory. Some colleagues of mine created a model for certain systems.
Starting point is 00:13:28 They wanted for this model to describe quantum thermodynamics systems. But just solving the classical version of the small-scale problem was complicated enough. After that problem was solved, then people could make progress on the really quantum features. So the classical small-scale system problem was in service of the quantum thermodynamics. All right. So we have a lot to discuss here. I'm a bit daunted because it's conceptually very rich and feels to me very new.
Starting point is 00:13:59 The recent wave of quantum thermodynamics that has become widely accepted as a subfield is very new. There's been this trend over the past 10 to 15 years in which quantum thermodynamics has grown a great deal. Quantum thermodynamics first started being thought about during the 1930s. A quantum engine was proposed in the 1950s and 60s. There was work in the 80s. But these pieces of work were not always accepted by the wider community. And quantum thermodynamics itself was sometimes called an oxymoron because thermodynamics was developed for large classical systems. So people just couldn't understand what it could possibly have to say about quantum systems. But in the early 2010s or so, as I was starting my graduate
Starting point is 00:14:48 work, quantum thermodynamics started to grow a great deal, I think, for two reasons. First, the field of quantum information science had matured in the early 2000s. We could use it as a mathematical, conceptual, and experimental tool kit for understanding quantum systems through how they store and process information. We could use those tools in quantum thermodynamics.
Starting point is 00:15:10 And second, some people managed to secure a very large grant for quantum thermodynamics. And so over the past 10 to 15 years, quantum thermodynamics has really boomed, and I think that's why it feels so new. Well, let's unpack some of the words that you've been using here. We keep saying quantum, but maybe we should just offer a quick reminder of what are some
Starting point is 00:15:30 of the key features that distinguish quantum phenomena or quantum systems we keep speaking of from classical systems? Like, what's the hallmark of something being fundamentally quantum mechanical? Some features are quantum systems tend to be small. They can have wave-like and particle-like natures. They can be disturbed a great deal by measurement in a way that classical systems aren't. They can entangle with each other, so form very strong relationships, which lead to really strong correlations. A quantum particle can have only certain amounts of energy, not absolutely any
Starting point is 00:16:12 possible amount of energy from zero on upward. Yeah, that's really where the word came from, isn't it? Right. Quantum literally means a small packet of something. And so an atom, say, can receive a quant of energy and so jump between rungs on their energy ladder to go from one discrete amount of energy to another discrete amount of energy. Yeah, it's all very deep and mysterious. Is it fair to say that we discovered quantum ideas in the context of physics originally in the quanta of quantized energy levels in atoms, but really quantum ideas are more general than physics. For instance, could quantum theory be a kind of generalization of probability theory and
Starting point is 00:16:54 information theory divorced from applications to atoms? Quantum information science indeed has spread into computer science and mathematics and engineering and is also inherently in chemistry. Indeed, it has spread into computer science and mathematics and engineering, and it's also inherently in chemistry. I think of quantum information science in two ways. On the one hand, I think of quantum information science as, as I mentioned, a mathematical, conceptual, and experimental toolkit for understanding quantum systems through how they store and process information, such as these strong correlations
Starting point is 00:17:27 that I mentioned earlier, entanglement. And on the other hand, I think of quantum information science as the study of how we can use quantum phenomena, like disturbance by measurements, to process information. So solve computational problems, secure information, communicate information, and so on in ways that are impossible if we have just classical technologies. One of the ideas that I ran across while preparing to talk to you struck me as pretty
Starting point is 00:17:57 mind-blowing, and I'm hoping you can help enlighten me and our listeners about it, was the relationship between information and work. That work can sometimes be done by changing the amount of information in a system. Am I getting that right? Like erasing information is tantamount to doing work or something like that? So work is a resource in thermodynamics because if
Starting point is 00:18:22 we have work then we can push a rock up a hill or charge a battery or power a car. Just as there are thermodynamic tasks like charging batteries, there are information processing tasks like storing information and solving computational problems. And in information theory, information is a resource. What I find really interesting is that information can also serve as a resource in thermodynamics. If we have information and heat, we can kind of combine those to obtain work, which we could use to, say, power a car.
Starting point is 00:18:59 And also the reverse is true. Work can be a resource in information processing. You mentioned erasure. So suppose that you did a calculation and you've filled up a whole piece of scrap paper and now I need to do a calculation, I need scrap paper. And suppose you hand your calculation to me, I don't know what your handwriting looks like. So it might be beautiful. So no offense. Suppose it's just really bad handwriting, so I can't read anything. Then I need to erase that information in order to have useful scrap paper for performing my computation. That erasure process is going to cost thermodynamic work. There's a fundamental limit, the fundamental lower bound on the amount of work required to
Starting point is 00:19:42 erase information. So both information and work can serve as resources in both thermodynamics and information processing. ISKRA at least two stories, grow up out of the setting that you've described. One story goes by the name of Szilard's engine. Leo Szilard was a great Hungarian-American physicist. He said, suppose that we have a gas in a box. Let's just suppose for convenience and simplicity that the gas consists of one particle to make things really easy. And suppose that we know which side of the box the gas is in, the right-hand side rather than the left-hand side. We can think of this whole system as storing a bit of information. So a bit, the basic unit of information, is something that can have one of two possible values, one or zero, right or left. So if we know that the gas particle
Starting point is 00:20:48 is in the right-hand side of the box, we have one bit of information. We know it's in the right-hand side rather than left-hand side. And we can turn this bit of information into work. We can slide a partition into the box down the center. So we trap our gas in the right-hand side. And suppose that we let the gas interact
Starting point is 00:21:10 with some fixed temperature environments through the walls of the box, so heat can flow into the box or out of the box. Then we can hook up a little weight to the partition and let the partition slide. The gas is going to hit the partition and it's gonna keep punching the partition slide. The gas is going to hit the partition and it's going to keep punching the partition until the partition reaches the left-hand side of the box.
Starting point is 00:21:31 So now the gas can be anywhere in the entire box. The gas has expanded, and as it has expanded and moved the partition, the partition has lifted the weight. So we've lifted a weight, we've done useful thermodynamic work lifted the weight. So we've lifted a weight, we've done useful thermodynamic work on the weight, but we no longer know where in the box the particle is. We lost our bit of information.
Starting point is 00:21:55 So we traded information for work using heats that flowed into the gas from the environment. It's that heat which we transformed into work with help from our bit of information. It's really very vivid. I love that explanation that you just gave. Let me see if I really got it. I think I did.
Starting point is 00:22:15 One molecule or one particle in a box, we think of it on the right, we've got this partition, a sort of sliding wall, potentially. The whole box is in a room at a certain temperature. And then somehow, that's enough to get this particle bouncing around randomly in its available space. And occasionally, it will hit the partition. Is that the idea?
Starting point is 00:22:39 That I'm only going to hit the partition from the one side. So the partition moves unidirectionally right to left, like a piston expanding. In some jerky way, it's going to be sliding, gradually expanding the available volume for this particle. So far so good, right? Yes. And so we're losing information as that's happening in the sense that we're increasing the volume, and so we have less information about where the particle is until ultimately when we've jammed the partition all the way over to the left wall, we now know nothing. The particle could be anywhere.
Starting point is 00:23:12 I guess we have zero bits of information at this point about where it is. Exactly. Yet we did useful work through this Rube Goldberg setup that you described. I've done work by losing information. Exactly. Exactly. Unbelievable. That's crazy. One thing I really love about Cillard's engine is this was described in a paper by Leo Cillard that was based on his PhD thesis, which was praised by Einstein. And we're still talking
Starting point is 00:23:40 about it about a hundred years later. So I present this story as a motivation to grad students for something they can aspire to. Ah ha, so for listeners who wouldn't know this name, Leo Szilard, he's pretty famous in the history, not just of science, but the history of the world, because he's the person who wrote the letter that Einstein signed to tell President Roosevelt that the U.S. needed to build an atomic bomb before Germany did. And he was also heavily involved after the Manhattan Project in Causes of Peace.
Starting point is 00:24:18 Yes, right. We'll be right back after this message. Quanta Magazine is an editorially independent online publication launched and supported by the Simons Foundation to enhance public understanding of science. The Simons Foundation does not influence the content of the podcast that you are listening to nor any of the stories covered by Quanta Magazine. Welcome back. So this is Zillard's engine explaining how information can be traded for work. But you said there was a second example about a box and a partition. Is there another one that we should talk about? I believe you alluded to this earlier, Landauer Erasure. Okay, yes,
Starting point is 00:25:00 please tell me about those two words. Landauer, that's Rolf Landauer. It's an information scientist at IBM. Uh-huh. He basically reversed C. Lard's engine. Suppose that there is a gas in a box. Again, let's think of it as being one particle. And suppose we don't know where the gas particle is. Its position in the box is totally random.
Starting point is 00:25:26 And we want to reset the particle's position to a nice clean state, the right-hand side of the box. This is like taking that sheet of scrap paper that has been scribbled on totally randomly and erasing it to a nice clean state. We can do that by sliding a partition into the box, now right next to the left-hand wall, and pushing the partition to the box's center.
Starting point is 00:25:50 The gas is going to be trapped in the right-hand side. Now, in order to slide the partition, we have to compress the gas, and this compression process requires us to spend work. At the end of the day, we have a bit of information. We know the gas is in the right hand side of the box rather than the left hand side, but we have spent work. So in this case, we could trade work for information. And why is this a conceptually important thought experiment? You're saying this is another way of understanding that there is a trade-off between work and information? In part, yes. As you said, there's a trade-off. Both work and information are useful in information processing and thermodynamics. You alluded to something deeper. I agree about that impulse. If we want to compute and keep computing and keep computing,
Starting point is 00:26:42 we're going to run out of scrap paper sometime. The universe doesn't have an infinite supply, so we're going to run out of scrap paper sometime. The universe doesn't have an infinite supply, so we're going to have to erase some. Charlie Bennett, somebody else at IBM, has argued that just ordinary computations can be performed at zero energy cost. It would be very, very difficult to do and very impractical, but in principle possible. However, when it comes to erasure, there is this unavoidable work cost. So there is some part of computation, namely erasure, that has an intrinsic fundamental thermodynamic cost.
Starting point is 00:27:18 When I first learned this, I was thrilled and so surprised because computation and thermodynamics seem like they don't necessarily have to have anything to do with each other, but they're very closely bound up. It's really quite an astonishing idea. It seems like at least some aspects of thermodynamics very much bear on the information science of the future. I mean, if we're going to do quantum computers soon, it sounds like we have to know something about thermodynamics. Did you say Landauer limit or something? Yes, it's sometimes called the Landauer bound or Landauer limit on the amount of work
Starting point is 00:27:57 required. And there is a movement maybe one could say called the thermodynamics of information. movement, maybe one could say, called the thermodynamics of information. And quantum thermodynamics and our friends in adjacent fields really love putting together thermodynamics and information. So, since I mentioned quantum computers, and I'm sure some listeners are wondering, you know, we're doing all these thought experiments with gases in boxes and moving partitions and things. This is not so far from reality. Like, maybe these things could become real at some point. So should we shift gears now to that question of applications? Is there any hope in the near future or the long-term future to make quantum machines
Starting point is 00:28:40 based on these principles or what should we think of in terms of coming applications? We could approach this question in a number of information and they checked how much work they needed to erase the information. You mentioned equilibrium earlier in the conversation. If we keep a system close to equilibrium, we don't disturb it a whole lot. We don't waste energy on riling it up. And so if we perform a process very slowly, since the system is always in equilibrium or close to equilibrium, we can get away with spending relatively small amounts of work. And so they ran their process more and more slowly and saw what is the value of work that seems like just below the amount of work that
Starting point is 00:29:46 the value of work that seems like just below the amount of work that they had to pay, that they were approaching. And they found that their observations were consistent with Landauer's bound. Now this Landauer bound, this minimal amount of work required to erase a bit of information that is pretty far from the amount of work that is actually spent to erase information in today's hardware. We operate quite far from this fundamental limitation. Some people are interested in trying to reach toward this bound to waste less energy. Heating of small parts of computers is a significant problem in both the classical realm of computing and the quantum realm of computing. That's one motivation that people have for studying such fundamental limits. And it does seem that quantum systems also obey an ultimate limit on the work
Starting point is 00:30:36 cost of erasure. But the operations that we can perform, say in computation, can be different. And also, we can ask, as you did earlier, what if we have a quantum gas in our box rather than a classical gas? Does anything change? Quite a few features change. And sometimes we can use quantum phenomena like entanglement as resources to help us out. One of my favorite examples was proved by colleagues including LĂ­dia del Rio, a Portuguese physicist. And they said, suppose that our piece of information that we want to erase is a piece of quantum information stored in a quantum particle that's entangled. So it has this set of strong correlations more or less with another quantum
Starting point is 00:31:25 system. And so there's this kind of reference system, and we can manipulate the entanglement. But what we want to do is erase the quantum information in this particle of interest, while keeping the reference system in its same state. We don't want to disturb it too much. being the reference system in its same state. We don't want to disturb it too much. It turns out that we can erase the particle of interest while on the whole gaining work instead of spending work, which is counterintuitive because we're supposed to spend work in order to erase information.
Starting point is 00:32:00 The trick is to kind of burn the correlations, the entanglement between the particle of interest and the reference in the presence of heat. So entanglement together with heat serves as this kind of thermodynamic quote unquote fuel that we can use to erase while extracting work. This doesn't violate Landauer's principle because Landauer wasn't actually thinking about a quantum particle that's entangled. This entanglement is kind of an extra resource that we're adding to the Landauer story after
Starting point is 00:32:33 making a quantum. So everything's consistent, but it could be a little surprising that we can use entanglement as a resource in this decades old erasure story. Oh, it's so interesting and weird what you just described. So there's a particle in a box, and it's entangled, let's say, with another particle outside the box. Is that right? In this case, it might be useful to think of a different kind
Starting point is 00:32:58 of platform or physical system. In classical information science, we think of bits, I don't know, a transistor in an ordinary computer and codes of bits. It can be in the zero state or the one state. When it comes to quantum information science, the basic unit of information is the qubit, the quantum bits. And we can store a qubit, for example, in a property of an electron. The electron doesn't have to be in a box, but if we bring two electrons together and perform some operation on them, as just one example,
Starting point is 00:33:33 then we can have two qubits that are entangled with each other. Remind me how that goes. So like, what particles, what properties are entangled, just to make it concrete? So one example is the spin of an electron. Okay. It can store qubits. There are also all sorts of other platforms that people are using to store qubits.
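[Editor's note: the "operation on two qubits" that produces entanglement can be sketched numerically. A minimal NumPy illustration using the textbook Hadamard-plus-CNOT construction of a Bell state; this is generic linear algebra, not tied to any particular physical platform discussed here:]

```python
import numpy as np

# A qubit is a unit vector in C^2; |0> and |1> are the basis states.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# Two unentangled qubits live in the tensor product space C^2 (x) C^2.
product_state = np.kron(ket0, ket0)  # the state |00>

# A Hadamard on the first qubit followed by a CNOT yields a Bell state,
# the canonical example of a maximally entangled pair.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
bell = CNOT @ np.kron(H, np.eye(2)) @ product_state  # (|00> + |11>)/sqrt(2)

# The reduced state of either qubit is maximally mixed, confirming maximal
# entanglement: its von Neumann entropy equals ln 2, one bit's worth, the
# same currency that appears in Landauer's bound.
rho = np.outer(bell, bell.conj())
rho_A = rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)  # trace out qubit B
eigs = np.linalg.eigvalsh(rho_A)
entropy = -sum(p * np.log(p) for p in eigs if p > 1e-12)
print(entropy)  # approximately ln 2, about 0.693
```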
Starting point is 00:34:00 For instance, superconducting qubits: a superconducting qubit is a really tiny circuit printed on a chip. And a current can flow in one direction in the circuit or flow in the other direction. These are kind of the two options that help define a qubit. So maybe now we have two entangled superconducting circuits in your setup. Sure. Okay. And then, but going back to the thing that was blowing my mind, which is that you could somehow burn entanglement to provide work. With scare quotes.
Starting point is 00:34:25 Yeah, scare quotes. It is scary. I never, I mean, entanglement is itself so spooky and amorphous in the way that an average person thinks about it that the idea that you could burn it and use it as a resource, this is the first I've ever heard of this idea. Entanglement gets destroyed sort of on its own very commonly, right? That's the reason we're not familiar with it in our ordinary macroscopic lives. So it sounds in a way like what you're describing could be commonplace.
Starting point is 00:34:51 Like if destroying the entanglement is the key, then well, we do that all the time. Merely destroying the entanglement in absolutely any way probably won't do the trick. But if you can control and manipulate the entanglement in the right way, then you can consume it in order to get out your work. I see. So if you manipulate the entanglement in the right way, that in some sense destroys it, it could be destroyed in a productive way. Yes. Huh. And does this whole thought experiment that you're describing now,
Starting point is 00:35:24 are you saying we're close to being able to do this? Or we can at least imagine we can do this? I wouldn't be surprised if someone had tried it. I don't think I've seen a paper, but increasing numbers of quantum thermodynamics experiments are being performed. So I wouldn't be too surprised if someone performed this experiment in the near future. The original theory of thermodynamics went hand in hand with the Industrial Revolution, which was useful in a very different sense. And now people are starting to pivot to try to make quantum thermodynamics, maybe quantum thermal machines, actually useful for us.
Starting point is 00:36:01 And not only really cool curiosities. Yeah, I do think that's a nice, answer that you're giving that so much of the pleasure of this subject seems to be the light it sheds on two very deep subjects, thermodynamics and quantum theory, which continues to be fascinating and seems that by combining those two we're getting even deeper understanding of both fields. It's starting to become imaginable to actually make things using these ideas. I know that you've been interested in something called autonomous quantum machines. Could you tell us a little about that?
Starting point is 00:36:39 What is an autonomous quantum machine in your mind or in reality? KS Sure. Quantum thermodynamics have designed quantum engines, quantum refrigerators, quantum batteries, quantum ratchets, and some of them have been realized experimentally. I think the experiments are very impressive. They demonstrate that the experimentalists have excellent control, and just the idea of making a quantum engine is very fun. On the other hand, you wouldn't want to invest in a company that provides quantum engines because a quantum engine is so small, it outputs very little energy. But cooling a system down so that it behaves in a quantum fashion and then manipulating the engine costs loads of energy. So the engine isn't worth it, except for the fun factor. Now there are, even in classical thermodynamics,
Starting point is 00:37:33 engines and refrigerators, machines that are autonomous. They can run on their own without the need for control that changes over time. You just give the machine access to some energy in its environment. The machine will extract the energy from its environment and do its own thing. We can also make quantum versions
Starting point is 00:37:56 of these autonomous thermodynamic machines. And since autonomous machines don't require lots of control, they offer some hope for actually making useful quantum thermodynamic machines. I recently collaborated with the lab of Simone Gasparinetti at Chalmers University in Sweden. Chalmers University is building a quantum computer from superconducting qubits. These superconducting qubits are in a large classical refrigerator called a dilution refrigerator, which cools the qubits down to, I think, tens of millikelvin. But suppose that this quantum computer has just finished a calculation. It's used some qubits up as scrap paper, which we keep returning to in this
Starting point is 00:38:45 conversation. And if we want to perform the next quantum computation, we need to clean off the scrap paper. And in experimental language, that means we need to cool down the qubits even more in order to reset them, even more than this classical refrigerator can manage. One can design a chip to stick inside the classical refrigerator that consists of more superconducting qubits that act as an autonomous quantum refrigerator. You can kind of hand over your computational qubits to this quantum refrigerator and let it do its thing, and then take the qubits away and use them in your computation. In order to get the quantum refrigerator to behave in a quantum way, you have to make it cold, but you just stick it inside this classical refrigerator,
Starting point is 00:39:36 which is already cold because you're already performing quantum computations. So the experiment that was done in Simone's lab was a proof-of-principle experiment, but it performed a lot better than I expected. So we're hoping that it's the beginning of making autonomous quantum machines useful. It makes me wonder, what is it about what you do and what you get to think about that brings you joy? Good question. I have always loved dealing in abstract ideas. I have always loved reading because that's always given me the opportunity to build universes in my head. In high school, I was very attracted to philosophy and mathematics, computer science, physics, and so on. And quantum thermodynamics gives me joy
Starting point is 00:40:28 because it enables me to play with these abstract ideas. I get to build universes in my head, you know, models of all sorts of different systems for a job. And I get to engage with all these different subjects and all their ideas. But there's also a sense of balance because there are applications of quantum information theory and quantum thermodynamics to quantum technologies like quantum computers and cryptography and so on.
Starting point is 00:40:59 So I get to play in the realm of ideas and also feel like maybe I'm doing something useful. Very impressive. Nice. We've been speaking with theoretical physicist Nicole Younger Halpern about the ins and outs of quantum thermodynamics. It's really been fun, Nicole. Thank you so much for joining us. Thank you.
Starting point is 00:41:19 It's been a lot of fun. Thanks for listening. If you're enjoying the joy of why and you're not already subscribed, hit the subscribe or follow button where you're listening. You can also leave a review for the show. It helps people find this podcast. The Joy of Why is a podcast from Quantum Magazine, an editorially independent publication supported by the Simons Foundation. Funding decisions by the Simons Foundation have no influence on the selection
Starting point is 00:41:51 of topics, guests, or other editorial decisions in this podcast or in Quanta Magazine. The Joy of Why is produced by PRX Productions. The production team is Caitlin Faults, Livia Brock, Genevieve Sponsler, and Merritt Jacob. The executive producer of PRX Productions is Jocelyn Gonzalez. Morgan Church and Edwin Ochoa provided additional assistance. From Quanta Magazine, John Rennie and Thomas Lin provided editorial guidance, with support from Matt Carlstrom, Samuel Velasco, Arlene Santana, and Megan Wilcoxon. Samir Patel is Quanta's editor-in-chief.
Starting point is 00:42:30 Our theme music is from APM Music. Julian Lin came up with the podcast name. The episode art is by Peter Greenwood, and our logo is by Jackie King and Christina Armitage. Special thanks to the Columbia Journalism School and Burt Odom-Reed at the Cornell Broadcast Studios. I'm your host, Steve Strogatz. If you have any questions or comments for us,
Starting point is 00:42:54 please email us at quanta at simonsfoundation.org. Thanks for listening.
