Donald Trump poses a huge dilemma for commentators: to ignore his daily outrages is to normalize his behavior, but to constantly write about them is to stop learning. Like others, I struggle to get this balance right, which is why I pause today to point out some incredible technological changes happening while Trump has kept us focused on him — changes that will pose as big an adaptation challenge to American workers as transitioning from farms to factories once did.

Two and a half years ago I was researching a book that included a section on IBM's cognitive computer, "Watson," which had perfected the use of artificial intelligence enough to defeat the two all-time "Jeopardy!" champions. After my IBM hosts had shown me Watson at its Yorktown Heights lab, they took me through a room where a small group of IBM scientists were experimenting with something futuristic called "quantum computing." They left me thinking this was "Star Wars" stuff — a galaxy and many years far away.

Last week I visited the same lab, where my hosts showed me the world's first quantum computer that can handle 50 quantum bits, or qubits, which IBM unveiled in November. They still may need a decade to make this computer powerful enough and reliable enough for groundbreaking industrial applications, but clearly quantum computing has gone from science fiction to nonfiction faster than most anyone expected.

Who cares? Well, if you think it's scary what we can now do with artificial intelligence produced by classical binary digital electronic computers built with transistors — like make cars that can drive themselves and software that can write news stories or produce humanlike speech — remember this: These "old" computers still don't have enough memory or processing power to solve what IBM calls "historically intractable problems." Quantum computers, paired with classical computers via the cloud, have the potential to do that in minutes or seconds.

Look at where we are today thanks to artificial intelligence from digital computers — and the amount of middle-skill and even high-skill work they're supplanting — and then factor in how all of this could be supercharged in a decade by quantum computing.

It's why IBM's CEO, Ginni Rometty, remarked to me in an interview: "Every job will require some technology, and therefore we'll need to revamp education. The K-12 curriculum is obvious, but it's the adult retraining — lifelong learning systems — that will be even more important."

Each time work gets outsourced or tasks get handed off to a machine, "we must reach up and learn a new skill or in some ways expand our capabilities as humans in order to fully realize our collaborative potential," said education-to-work expert Heather McGowan.

I didn't mean to distract from the "Trump Reality Show," but I just thought I'd mention that "Star Wars" technology is coming not only to a theater near you, but to a job near you. We need to be discussing and adapting to its implications as much as we do Trump's tweets.

Thomas L. Friedman writes for The New York Times.