Whale Oil, AI, and the Future Focus of Cyber
There was a point in our history when the world ran on whale oil. It was used to lubricate machinery, provide lighting, and power the industrial revolution. Within a few decades that technology collapsed and was rapidly replaced by new tech. I believe we are seeing the beginning of a similar tectonic shift.
Long ago, indigenous peoples hunted whales for food and tools and used the oil for heating and lighting. In the 1500s, Basque sailors from Spain and France were among the first Europeans to engage in commercial whaling, primarily in the Bay of Biscay. In the early 1600s the Dutch and English established whaling operations in Arctic regions like Svalbard, and around 1672 colonists began using drift whales that washed ashore in New England. In the 1700s whaling expanded to the deep sea in the Atlantic, and places like Nantucket and New Bedford, Massachusetts became major centers of the whaling industry (the Silicon Valley of whaling, so to speak).
The American whaling industry reached its peak between 1835 and 1860, with over 700 ships engaged in the work, and Melville's Moby-Dick brought the significance of whaling to the forefront of culture in 1851.
Then, in 1859, it all changed.
Edwin Drake drilled the first successful commercial oil well in Titusville, Pennsylvania, marking the birth of the American petroleum industry. The American Civil War further disrupted whaling, partially due to Confederate raiders attacking Union whaling ships, but also because men were drawn into the war and away from whaling.
Kerosene, derived from petroleum, became the dominant fuel for lighting, leading to a sharp decline in whale oil demand. Then in 1886 electricity became widespread with the introduction of electric street lighting, further reducing the need for whale oil.
By the early 1900s the American whaling fleet had diminished significantly, with only a few ships remaining in operation, focused more on baleen and whalebone than on oil.
The last American whaling ship, the Charles W. Morgan out of New Bedford, ended its final voyage in 1924. The whaling industry shifted to countries like Japan, but in America it was dying. Then the League of Nations and the International Whaling Commission began to regulate the industry and implement conservation programs due to overhunting and environmental damage. By 1971 the US had ceased all commercial whaling activities.
Whaling and whale oil were a cornerstone of the global economy for over 300 years, but when the industry died, it died fast. Its decline led to widespread unemployment and economic hardship. The culture itself shifted. Many different groups were impacted, from sailors to shipbuilders, oil-processing factory workers, merchants, and venture investors. Whole towns experienced economic downturns, similar to what later happened to the auto industry in Detroit, which in turn led to migration as people sought work in other industries and regions.
There were no formal retraining programs, either industry- or government-led. Some workers adapted through apprenticeships and on-the-job training in adjacent growing industries like building steam vessels or railroads.
Other Major Industrial Shifts
We have seen many other major technologically driven industrial shifts with similar impacts on workers and communities, and each time humanity has adapted, innovated, and moved with the change, even though some people were lost and left behind. Some examples include:
- The Decline of the Ice Harvesting Industry
- Transition from Horses to Automobiles
- Decline of the Telegraph and Rise of the Telephone
- Shift from Sail to Steam Powered Ships
- Move from Coal to Oil and Natural Gas
- Decline of Textile Mills in Developed Countries
- Digital Revolution in Photography
- Decline of Video Tape and Rental Industry
- Automation & Robotics in Manufacturing & Move Overseas
Each of these shifts resulted in job losses, cultural changes, migration, and economic upheaval. However, they also led to technological advancements, innovation, adaptation, and huge growth in new job opportunities. Workers who can adapt and roll with the changes often do very well, even thrive.
AI And Ambient Technology as the new Petroleum
The advent of Large Language Models and ambient tech such as Meta's Orion AR glasses represents the beginning of one of these major shifts. Drake (in the form of OpenAI, NVIDIA, Meta, et al.) has drilled the first well of the modern era.
You might be thinking that we have seen multiple failures of wearable tech, from Google Glass to Meta's adventures in VR, but remember, Microsoft had a chunky, unpopular tablet in the 90s, over a decade before the iPad was released. Some iteration of this technology will take hold, maybe an injectable Neuralink-style implant, maybe a holographic projection from a watch, IDK.
We have seen LLMs disappoint in many ways, but the mass implementation of this tech is extremely new. It took almost 100 years from Drake's well to the 1971 ceasing of commercial whaling, but the change was inevitable.
I believe we will move away from large teams of programmers and supporting staff, to small, close-knit teams of highly creative individuals who focus on what to build and why, rather than on syntax and detailed coding. A shift away from large scale hiring and hypergrowth in terms of staffing, and towards elite, low overhead but very well compensated, innovation units.
I see a lot of posts about how disappointing LLMs are as coding assistants, but I think that's because people haven't learned how to effectively integrate them into the process yet. Here is what I have learned:
- Not all chat LLM interfaces are good for coding.
- Not all models are optimized for coding.
- Prompt engineering is extremely important for code assist; done incorrectly, the results are really bad.
- Context size has a huge impact. Six months to a year ago, LLMs would lose context and start introducing bugs, forgetting previous code implementations, and giving conflicting advice. As context size increases, coding outcomes get better and better.
- Most people aren't "agentizing" yet, integrating the AI into their IDEs, build, and test environments; this is where it will start to really shine.
- LLMs are pretty good at accelerating low-hanging-fruit debugging, quality improvements, and planning. A human is needed to refine and polish the output, but you can replace several agile team members fairly readily.
- Learning. LLMs are helping me learn to code much faster than any other approach I've taken. For example, I often learn about features in Python I can apply to specific corner cases that I've never seen on a Stack Overflow post.
- Break it up. First I have an LLM help me outline the project and break it up into tasks. Then I break it up into the functions or objects that will be needed to complete the tasks. Then I use the LLM to help me implement a singular function, while maintaining the context of the whole project. Take smaller bites rather than huge ones.
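The "break it up" approach above can be sketched in code. This is a hypothetical illustration, not a real API: the project outline, task names, and `build_function_prompt` helper are all invented to show the pattern of keeping whole-project context while asking for one narrow piece at a time.

```python
# A minimal sketch of the "break it up" workflow: keep a compact project
# outline as shared context, then prompt for one function at a time.
# The outline, tasks, and helper below are illustrative, not a real API.

PROJECT_OUTLINE = """\
Project: CSV report generator
Tasks:
1. load_rows(path) -> list[dict]
2. summarize(rows) -> dict
3. render_report(summary) -> str
"""

def build_function_prompt(outline: str, task: str) -> str:
    """Combine the whole-project outline with a single, narrow task."""
    return (
        "You are helping implement one function in a larger project.\n"
        f"Project context:\n{outline}\n"
        f"Implement only this function now: {task}\n"
        "Return the function body with a docstring and type hints."
    )

# One small bite: the model sees the full plan but works on one function.
prompt = build_function_prompt(PROJECT_OUTLINE, "summarize(rows) -> dict")
print(prompt)
```

Each call stays small, but because the outline rides along with every request, the model keeps the whole project in view instead of drifting.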
Another major shift caused by AI is in the energy and hardware industries. Much like those who sold shovels to miners during the gold rush made much of the wealth, those who sell GPUs, data centers, and energy to AI companies will see huge growth and demand. It's already happening.
Cyber's New Focus
In my career, I've seen whole classes of bugs die. Format string vulnerabilities immediately come to mind. SQL injection is fading due to improvements in database libraries. With the adoption of memory-safe languages like Rust and C#, memory corruption exploits are becoming more difficult.
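The SQL injection point is easy to see in miniature. Here's a small sketch using Python's standard `sqlite3` module (the schema and data are invented for illustration): modern database libraries bind parameters as data, so attacker-controlled input can no longer rewrite the query.

```python
import sqlite3

# Why SQL injection is fading: parameterized queries keep input out of the
# query grammar. The table and rows here are invented sample data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

malicious = "alice' OR '1'='1"

# Unsafe: string concatenation lets the input become part of the SQL.
unsafe = f"SELECT role FROM users WHERE name = '{malicious}'"
print(conn.execute(unsafe).fetchall())  # injection succeeds, row leaks

# Safe: the driver binds the value as data, never as SQL.
safe = conn.execute("SELECT role FROM users WHERE name = ?", (malicious,))
print(safe.fetchall())  # no match, no injection
```

When the library makes the safe form the path of least resistance, the bug class withers, which is exactly the pattern with format strings and memory corruption too.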
So how will cyber look in this new industrial revolution?
When I started, network penetration tests with a focus on remote memory exploits, as well as manual attack investigations, were the core of cyber. Then firewalls, IDS, SIEM, and endpoint protection came along. After that we saw the rise of CTI and SDLC. Now there is a huge focus on standards, policy, procedure, and governance. Already in my career I have seen this go from non-existent to arguably the largest area of the field. The implementation of these things will increasingly be handled by AI, but the design will be tailored to each business by humans.
Vulnerability triage. This is something that I think will be huge. Imagine you use an AI to help build an automated fuzzing system as a part of your SDLC. Now imagine another AI / agent performing automated crash analysis for you, and a third automating testing and instrumentation. (All things we are working on.) Things are going to get very interesting.
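One small piece of that triage pipeline can be automated today without any AI at all: deduplicating fuzzer crashes by hashing the top frames of each stack trace, so an analyst (or an agent) only looks at unique bugs. The crash records below are invented sample data, and the bucketing scheme is a simplified sketch of what crash-analysis tooling commonly does.

```python
import hashlib
from collections import defaultdict

# Toy triage step: collapse near-identical fuzzer crashes into buckets
# by hashing the top stack frames. The crash records are invented data.
crashes = [
    {"id": 1, "stack": ["parse_header", "read_field", "memcpy"]},
    {"id": 2, "stack": ["parse_header", "read_field", "memcpy"]},
    {"id": 3, "stack": ["decode_image", "alloc_buf"]},
]

def bucket_key(stack, top_n=3):
    """Hash the top N frames so duplicate crashes land in the same bucket."""
    return hashlib.sha256("|".join(stack[:top_n]).encode()).hexdigest()[:12]

buckets = defaultdict(list)
for crash in crashes:
    buckets[bucket_key(crash["stack"])].append(crash["id"])

print(len(buckets))  # 3 raw crashes collapse into 2 unique buckets
```

An agent sitting on top of a pipeline like this can then spend its effort on the genuinely new buckets: assessing exploitability, instrumenting a reproducer, and drafting the report.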
I think there will be a larger focus on how things are implemented and hook together (trust relationships, etc.), rather than on bad coding issues.
Out-of-band protocol issues. IoT is growing rapidly and spans everything from cars, to spacecraft sensors, to refrigerators, to electronic warfare involving drones. This becomes less about vulnerabilities in software and more about communication protocols, RF, and sideband effects.
I imagine cyber diverging into several core streams of focus:
- Individuals who are good at thinking about the enterprise, legal and regulatory requirements, NIST standards, and driving customized implementations.
- Highly specialized individuals, enhanced by AI tools, who focus on looking at chips, protocols, and hardware / firmware level issues.
- AI and Quantum specialists who either attack those types of systems, or use them to enhance attacks on more traditional technologies.
I could be biased because the second and third are my areas of interest, so there may be others I am missing. Feel free to correct me in the comments.
Either way, massive change is coming over the horizon, and it will require resilience and adaptability to avoid going the way of an 1800s whale hunter.
What do you think will be the Drake well of cyber?
A.