12 Comments
Dr. Jasmin Smajic

“Science fiction isn't primarily concerned with predicting specific inventions like the automobile or the spaceship. Its power lies in exploring the broader forces that drive innovation and the societal transformations that emerge.”

When I was a scientist, I struggled to come up with ideas for research, and one of my mentors specifically told me to read science fiction. It opened my mind to the possibilities, and I have been hooked ever since.

The One Percent Rule

Brilliant example, Jasmin, thank you. It is remarkable how many authors of science fiction have studied science and can help us imagine other possibilities, in our own domains and beyond.

Marginal Gains

I’ve mostly read sci-fi books published in the last few decades, such as The Martian, Project Hail Mary, and Daemon. I’ve also been a fan of sci-fi movies and series since childhood, including classics like Star Trek, Star Wars, and Contact.

However, I’ve struggled to connect with iconic works like Dune, The Foundation, or the Three-Body Problem series. While I plan to revisit them in the future, for some reason, they didn’t resonate with me as much as the more contemporary sci-fi I’ve encountered.

One of the things I find fascinating about science fiction is its power to inspire real-world innovation. The ideas and technologies we encounter in sci-fi can motivate people to pursue and build them, often advancing human civilization. At the same time, sci-fi also serves as a cautionary tale—it shows us how the wrong ideas or unchecked ambitions can lead to disasters or even annihilation. This duality is one of the genre’s greatest strengths.

That said, blindly pursuing growth and technological advancement without public awareness, involvement, and ethical guardrails is dangerous. It delays the adoption of these innovations and can lead to unintended consequences that may outweigh their potential benefits. The current state of technological development is particularly concerning because a tiny group of individuals or companies often decides what to build and what to prioritize. Broader societal involvement could lead to better, more thoughtful outcomes, even if it might slow progress in the short term.

We still lack robust governance frameworks, ethical guidelines, and collaborative systems to manage the impact of major emerging technologies like AI, synthetic biology, climate technologies, quantum computing, and space exploration. Without these structures, we’re at the mercy of the motivations and incentives of a select few. While these technologies may benefit humanity in many cases, the risk of unintended consequences remains high, especially over the long term.

This raises an important question: What level of risk are we, as a society, willing to accept in pursuing innovation? Building future technologies without foresight and safeguards might not cause immediate harm, but the long-term consequences could be catastrophic if we’re not careful.

I’m curious about your perspective, particularly in light of your reading. Do you see a noticeable difference between older and newer sci-fi books? Has the focus of the genre shifted over the past few decades?

The One Percent Rule

I have not read The Three-Body Problem yet; I will at some point!

Overall, science fiction opens our minds to many possibilities. I have been amazed when reading some of these authors and their ideas. Lem was incredibly foresighted. He nailed LLMs perfectly and also spoke of societal unrest in his book Robots.

With respect to regulations - absolutely, many authors emphasize this. For example:

H.G. Wells's 1914 science fiction novel, “The World Set Free,” was the first to speak of the possibility of creating an atomic bomb. He was deeply concerned with regulation and societal impact, and he went on to be an instigator of the UN's Universal Declaration of Human Rights; a host of other regulations came from his writing and lobbying. In The Time Machine he shows the dangers of split societies.

Fahrenheit 451 by Ray Bradbury. This is a chilling look at a future where books are outlawed and firemen burn any they find. It examines the dangers of censorship, the importance of critical thinking, and the power of knowledge.

Brave New World by Aldous Huxley. Huxley brilliantly shows the dark side of a seemingly utopian society where technology is used to control and suppress individual freedom. He raises questions about genetic engineering, social conditioning, and individuality.

I, Robot by Isaac Asimov. The Three Laws of Robotics are a classic example of trying to establish ethical guidelines for artificial intelligence.

Similarly, The Diamond Age by Neal Stephenson deals with nanotechnology, artificial intelligence, and the impact of advanced technology on social structures and education. Stephenson raises questions about access to technology, the role of government, and the potential for social upheaval.

Finally, not the best I've read in terms of writing, but Machinehood by S.B. Divya (who has a degree in Computational Neuroscience) is a near-future thriller that tackles issues of automation, artificial intelligence, and the rights of workers in a world where machines are increasingly taking over human jobs. Maybe too far-reaching, and it tries to pack in too many themes, but it was up for a Hugo award! Yet it has not caught on like the others above.

There are many more that show the dangers of technology and how we should regulate it.

Marginal Gains

I will find time to read some of these older sci-fi novels this year. Although I had planned to focus on history and non-fiction AI, I will include some fiction as part of my AI-related reading. As you said above, it will probably help me get a better handle on where we are heading, especially the dangers of technology.

If you have not read the series below, I recommend it. Check what year the first book in the series was published before you read it, so you know that some of the things mentioned in it were ahead of their time:

https://www.amazon.com/dp/B074CDHK46?binding=kindle_edition&searchxofy=true&ref_=dbs_s_aps_series_rwt_tkin&qid=1736785293&sr=8-1

The One Percent Rule

Thank you. Ordered the 2 books :-) I agree, history, science, and non-fiction AI are my main reads; science fiction is generally a gap filler, but equally thought-provoking.

Marginal Gains

The one subject in science I need to work on further is biology. I earned my bachelor's degree in Physics, Chemistry, and Mathematics, and I can handle those three well, but the last time I studied biology was in 8th grade. I especially want to work on neuroscience, since I think some of the ideas for solving the last 10% of the edge cases will come from it.

The One Percent Rule

I like Sapolsky for this - his book Behave is great, and his lectures are first class, with over 18 million views on the first one here: https://www.youtube.com/watch?v=NNnIGh9g6fA

The series of lectures is well worth watching if you have not seen it.

Marginal Gains

I know of him and have seen a few of his videos but have not read his book.

Nathan White

Fahrenheit 451! My all-time favorite piece of fiction.

The One Percent Rule

Now that is a classic - great reminder, thank you, Nathan.
