Tag Archives: Film

THE CRAZY LIFE AND DEATH OF HOWARD HUGHES

Howard Hughes was a man who could design and test-fly an airplane, direct a movie, seduce a starlet, buy casino hotels, disappear for years, and still make headlines without showing his face. He was as much a symbol of American ambition as he was a cautionary tale of what unchecked wealth, genius, and madness can do to a man. Born into privilege, fueled by obsession, and haunted by demons, Hughes lived a life so extreme that it bordered on mythology. But his death—quiet, grim, and mysterious—might be stranger than the intense living that led to it. Here’s the drama of the crazy life and death of Howard Hughes.

To understand his end, we have to rewind to the beginning of a life lived on the edges of brilliance and breakdown. Howard Hughes was many things: inventor, aviator, filmmaker, billionaire, recluse, suspected intelligence asset, and perhaps most tragically, a prisoner of his own mind.

He died aboard a private jet, his six-foot-four frame weighing only ninety pounds, unrecognizable even to those who’d once worshipped him. The official version says kidney failure. But the deeper you dig, the more the story starts to crack. It was a death as strange as his life—one that still casts a long shadow.

Howard Robard Hughes Jr. was born on December 24, 1905, in Humble, Texas, into a family drenched in oil money. His father, Howard Sr., invented the Hughes rotary drill bit and founded the Hughes Tool Company, which would bankroll young Howard’s endless stream of curiosities and obsessions. By age 11, he had built Houston’s first wireless radio transmitter. At 12, he constructed a motorized bicycle from scrap parts. By 14, he was designing working aircraft models in his room. But early brilliance often walks hand in hand with isolation.

Tragedy struck fast and deep. His mother Allene died when he was just 16—reportedly from complications of an ectopic pregnancy. His father died suddenly two years later from a heart attack. At 18, Hughes was a millionaire orphan with complete control over the Hughes Tool fortune. No advisors. No parental guidance. Just money, ambition, and a ticking mind that was already showing cracks.

He dropped out of Rice University and headed west to Los Angeles. Hollywood in the 1920s was wild, wide open, and vulnerable to someone like Hughes: rich, eccentric, and hungry to create. His first film, “Swell Hogan,” was a bomb. But he rebounded with Hell’s Angels, an over-the-top war epic that cost $4 million, used real WWI aircraft, and took three years to complete. Hughes delayed filming repeatedly, waiting for perfect cloud formations to shoot aerial scenes. That level of obsessive control would become his hallmark.

He followed up with The Outlaw (1943), mostly remembered for its promotional posters featuring Jane Russell’s cleavage. Hughes engineered a custom bra for her, designed to lift and frame her bustline more dramatically under studio lights. While Russell later claimed she never wore the thing, Hughes’s reputation as a hyper-controlling, detail-obsessed innovator was sealed. He didn’t just direct movies—he reimagined how to shoot them.

But filmmaking was just the opening act. Hughes’s true passion—perhaps his purest love—was aviation. In 1935, he set a world airspeed record flying the Hughes H-1 Racer. In 1938, he flew around the globe in 91 hours, earning him a ticker-tape parade in New York and a congratulatory telegram from President Franklin D. Roosevelt. His company, Hughes Aircraft, exploded into a major defense contractor, developing radar systems, missiles, and later, aerospace technology. He personally test-piloted many of the prototypes—sometimes successfully, sometimes not.

The worst crash came in 1946 while piloting the XF-11 reconnaissance plane over Beverly Hills. He clipped telephone wires and crash-landed in a residential area, destroying several homes. He broke dozens of bones, suffered third-degree burns, and nearly died. He was pulled from the wreckage by a U.S. Marine who happened to live nearby. The physical pain lingered for the rest of his life. So did the emotional trauma.

This is the crash that many believe began driving Howard Hughes crazy.

He emerged from the hospital addicted to morphine, codeine, and later Valium. But the painkillers didn’t just numb the physical agony—they dulled the sharp edges of a mind that was becoming unhinged. He began displaying symptoms that today would be clearly diagnosed: Obsessive-Compulsive Disorder (OCD), Post-Traumatic Stress Disorder (PTSD) from repeated crashes, Traumatic Brain Injury (TBI) from head trauma, and likely undiagnosed neurosyphilis, which can cause hallucinations and severe personality changes in its late stages.

He began spiraling. He became consumed with hand-washing rituals that lasted hours. He insisted on sealed containers for his food. He wrote memos detailing the precise number of tissues someone should use when handling a document. He refused to be touched. And then, gradually, he refused to be seen at all.

By the 1950s, Hughes disappeared from public life. He moved into the Desert Inn hotel in Las Vegas and refused to leave. When the owners threatened eviction, he bought the hotel. Then he bought more—four additional Vegas properties, including the Sands and the Frontier. He watched the city from behind blackout curtains while seated naked in a chair, surrounded by jars of his own urine. He ate the same meal—TV dinners, Hershey bars, and whole milk—every day. For months at a time, he wouldn’t speak. He communicated through written notes. Many were borderline incoherent.

He trusted only a small inner circle of Mormon aides—dubbed the “Mormon Mafia.” These men controlled access to Hughes. They decided who could speak to him, when medications were administered, and even, allegedly, which documents he signed. Whether they were loyal caretakers or self-serving gatekeepers is still up for debate. Some say they protected him. Others believe they manipulated him for their own ends.

Meanwhile, Hughes was still making moves. His influence extended far beyond real estate and film. His company, Hughes Aircraft, was a key contractor for the U.S. government. In 1975, it was revealed that the CIA had used Hughes’s name and company as cover to build a deep-sea vessel—the Glomar Explorer—to recover a sunken Soviet submarine. The 1974 operation, known as Project Azorian, remains one of the most ambitious and secretive intelligence operations in history. Hughes’s name gave the cover story credibility. It also gave the CIA plausible deniability.

Hughes’s political entanglements didn’t stop there. He had longstanding financial connections to powerful people—most notably Richard Nixon. It’s widely believed that Hughes funneled large sums of money through intermediaries like Bebe Rebozo, a close Nixon ally. Some even argue that the 1972 Watergate break-in was partly motivated by a desire to retrieve sensitive documents linking Nixon to Hughes. Though never definitively proven, the rumors persisted and added another shadow to Hughes’s legacy.

And through it all, he was deteriorating—mentally, physically, and emotionally.

His fingernails grew inches long and curled under themselves. His toenails cracked and yellowed. He refused to bathe or cut his hair. He developed allodynia, a condition where even a soft touch causes extreme pain. He wore Kleenex boxes on his feet and sat naked for days at a time in darkened rooms, watching old movies on repeat. He feared germs, radiation, and even sunlight. His world shrank to a few rooms and a few carefully controlled interactions. He had gone from a bold aviator and innovator to a whisper behind a hotel room door.

In 1972, author Clifford Irving sold a fake Hughes autobiography to publisher McGraw-Hill. Irving claimed he had conducted secret interviews with Hughes. The hoax unraveled spectacularly when Hughes—out of hiding—called in to a press conference and publicly denied any involvement. The voice was unmistakably his. It was the last time the world would ever hear it.

In his final years, Hughes drifted from hotel to hotel, city to city: Managua, Vancouver, Acapulco, London. He traveled by private jet, hidden away, often sedated. His last known photograph is debated. Even his closest aides gave conflicting accounts of where he was at any given time.

On April 5, 1976, Howard Hughes died aboard a chartered Learjet, 30,000 feet over New Mexico, en route from Acapulco to Houston’s Methodist Hospital. He was pronounced dead at 1:27 a.m. The official cause: kidney failure. But when his body was examined, doctors were shocked. He weighed just 90 pounds and had shrunk more than four inches in height. His hair and beard were matted and uncut. His fingernails were several inches long. His skin was covered in sores. He was so unrecognizable, the FBI had to use fingerprints to identify him.

The coroner declared natural causes. But an 18-month private investigation painted a more disturbing picture. According to their report: “Persons unknown intentionally administered a deadly injection of codeine painkiller to this comatose man—obviously needlessly and almost certainly fatal.”

Was it euthanasia? Murder? A mercy killing? Or just gross negligence? We’ll likely never know. But Hughes’s legacy was immediately thrown into chaos. There was no clear will. Dozens of people claimed to have one. Most were forged. One, presented by gas station attendant Melvin Dummar, claimed Hughes had left him $156 million. It was ruled a fake, but the story became the basis for the film Melvin and Howard.

Even in death, Hughes was a myth waiting to be rewritten.

His Howard Hughes Medical Institute—originally established as a tax shelter—became one of the largest and most respected biomedical research organizations in the world. His story inspired books, films (The Aviator among them), and countless conspiracy theories. He remains one of the most complex, contradictory figures in American history.

So, what drove Howard Hughes crazy?

It wasn’t just the painkillers. Or the isolation. Or the crashes. It was the collision of genius without limits, power without oversight, and a mind without rest. He was a man of staggering vision—who could imagine worlds that hadn’t yet been built—but also a man whose compulsions devoured him from the inside out. He chased perfection in everything: flight, film, business, beauty. And perfection, for Hughes, was always just one more note, one more tweak, one more cleaning away.

He died not just from kidney failure—but from the failure of the support system around him, which let a brilliant man collapse into madness behind closed doors.

This is the real Howard Hughes—the boy genius, the master builder, the spy asset, the germ-fearing recluse, the paranoid mogul, and the man whose life and death still stir questions we may never answer.

And this was the crazy life and death of Howard Hughes.


NON-ARTIFICIAL INTELLIGENCE

Now that the balloon has popped on failed fads like Dot.Coms, Bored Ape NFTs, Crypto, and forever-free borrowed money, the world’s current FOMO (Fear Of Missing Out) has turned to the newest and coolest cat—Artificial Intelligence or what’s simply called AI. Make no mistake, AI is real. It’s not simple, but it’s very, very real. And it has the potential to be unbelievably good or gut-wrenchingly awful. But as smart as AI gets, will it ever be a match for Non-Artificial Intelligence, NAI?

I can’t explain what NAI is. I just have faith that it exists and has been a driving force in my life, especially my current life where I’m absorbed in a world of imagination and creativity. Call it make-believe or living in a dream, if you will, but I’m having a blast with a current fiction, content-creation project which uses both AI and NAI.

I’ve asked a lot of folks—mainly writing folks because that’s who I hang with—what their source of inspiration is. Their muse or their guide to the information pool they tap into to come up with originality. Many casually say, “God.”

I don’t have a problem with the concept of God. I’ve been alive for 66 years and, to me, I’ve seen pretty strong evidence of an infinite intelligence source that created all this, including myself. I’ll call that force NAI for lack of a better term.

What got me going on this AI/NAI piece was three months of intensive research into the current state of artificial intelligence—what it is, how to use it, and where it’s going. AI is not only a central character in my series titled City Of Danger, AI is a tool I’m using to help create the project. I’m also using Non-Artificial Intelligence as the inspiration, the imagination, and the drive to produce the content.

If you’ve been following DyingWords for a while, you’re probably aware I haven’t published any books in the past two years except for one about the new AI tool called ChatGPT. That’s because I’m totally immersed in creating City Of Danger in agreement with a netstream provider and a cutting-edge, AI audio/visual production company. Here’s how it works:

I use my imagination to create the storyline (plot), develop the characters and their dialogue, construct the scenes, and set the overtone as well as the subtext theme. I use NAI for inspirational ideas and then feed all this to an AI audio/visual bot who scans real people to build avatars and threads them through a “filter” so the City Of Danger end-product looks like a living graphic novel.

Basically, I’m writing a script or a blueprint so an AI program can take over and give it life. The AI company does the film work and the netstream guy foots the bill. This is the logline for City Of Danger:

A modern city in existential crisis caused by malevolent artificial intelligence enlists two private detectives from its 1920s past for an impossible task: Dispense street justice and restore social order.

Here’s a link to my DyingWords web page on City Of Danger along with the opening scene of the pilot episode. Yes, it involves time travel and dystopian tropes which have been done to death—but not quite like this. I like to think of myself as the next JK Rowling except I’m not broke and don’t write in coffee shops with a stroller alongside.

I was going to do this post as a detailed dive into the current state of artificial intelligence and where this fascinating, yet intimidating, technology is going. However, I have a long way to go yet in my R&D and don’t have a complete grasp on the subject. I will give a quick rundown, though, on what I’ve come to understand.

The term (concept) of artificial intelligence has been around a long time. Alan Turing, the father of modern-day computing and its morph into AI, conceived a universal thinking machine back in 1936—years before he cracked Nazi communication codes during WW2. In 1956, a group of leading minds gathered at Dartmouth College where, over the summer, they brainstormed and laid the foundation for future AI breakthroughs.

Fast forward to 2023 and we have ChatGPT version 4 and a serious, if not uncontrollable, AI race between the big hitters—Microsoft and Google. Where this is going is anyone’s guess, and recently other big guns like Musk and Wozniak weighed in, signing an open letter asking the AI industry to cool its jets and pause for six months. To quote Elon Musk, “Mark my words, AI is far more dangerous (to humanity) than nukes.”

There’s huge progress happening in AI development right now. But stop and look around at how much AI has already affected your life. Your smartphone and smartTV. Fitbit. GPS. Amazon recommends. Siri and what’s-her-name. Autocorrect. Grammarly. Cruise missiles, car parts, and crock pots.

Each day something new is mentioned. In fact, it’s impossible to scroll through a newsfeed without the AI word showing up. We’re in an AI revolution—likely the Fourth Industrial Revolution, to steal the phrase from Klaus Schwab and his World Economic Forum.

Speaking of an AI revolution, one of the clearest runs at explaining AI in layman’s terms is a lengthy post written and illustrated by Tim Urban. It’s a two-part piece titled The AI Revolution: Our Immortality or Extinction. Tim calls AI “God in a Box”. Here’s what ChatGPT had to say about it.

Tim Urban’s two-part post “The AI Revolution: Our Immortality or Extinction” explores the potential impact of artificial intelligence (AI) on humanity.

In part one, Urban describes the current state of AI, including its rapid progress and the various forms it can take. He also discusses the potential benefits and risks of advanced AI, including the possibility of creating a “superintelligence” that could surpass human intelligence and potentially pose an existential threat to humanity.

In part two, Urban delves deeper into the potential risks of advanced AI and explores various strategies for mitigating those risks. He suggests that developing “friendly AI” that shares human values and goals could be a key solution, along with establishing international regulations and governance to ensure the safe development and use of AI.

Overall, Urban’s post highlights the need for thoughtful consideration and planning as we continue to develop and integrate AI into our lives, in order to ensure a positive outcome for humanity.

From what I understand, there are three AI phases:

  1. Narrow or weak artificial intelligence—where the AI system only focuses on one issue.
  2. General artificial intelligence—where the AI system is interactive and equal to humans.
  3. Super artificial intelligence—where the AI system is self-aware and capable of reproducing itself.

We’re in the narrow or weak phase now. How long before we reach phases two and three? There’s a lot of speculation out there by some highly qualified people, and their conclusions range from right away to never. That’s a lot of wriggle room, but the best brackets I can put on the figures are 2030 for phase two and 2040 for phase three. Give or take a lot.

The AI technology involved in City Of Danger is a mid-range, phase one product. The techie I’m talking to feels it’ll be at least 2025 before it’s perfected enough to have the series released. I think it’s more like 2026 or 2027, but that’s okay because it gives me more time to tap into NAI for more imaginative and creative storyline ideas.

I’m not going to go further into Narrow AI, General AI, or Super AI in this post. I’d have to get into terms like machine learning, large language model, neural networks, computing interface, intelligence amplification, recursive self-improvement, nanotech and biotech, breeding cycle, opaque algorithms, scaffolding, goal-directed behavior, law of accelerating returns, exponentiality, fault trees, Boolean function and logic gates, GRIND, aligned, non-aligned, balance beam, tripwire, takeoff, intelligence explosion, and that dreaded moment—the singularity. Honestly, I don’t fully understand most of this stuff.

But what I am going to leave you with is something I wrote about ten years ago when I started this DyingWords blog. It’s a post titled STEMI—Five Known Realities of the Universe. Looking back, maybe I nailed what Non-Artificial Intelligence really is.

THE FUTURISTIC FILM INDUSTRY

The future is coming fast—especially in the film industry.  Some of it’s already here. Augmented and virtual reality. CGIs. Digital recreation. Algorithmic editing. Edge computing. 5G/6G networks. Cloned voices. Scanned actors. Non-real celebrities. Drones. Artificially intelligent screenwriting. Remote filmmaking. 3D printed sets. 3D previsualization. Real-time rendering. Sound and light tech breakthroughs. DJI Ronin 4D 6K condensed cinematic lenses. Micro cameras. Avatars & holograms. Blockchain, crypto & NFTs. The Internet of Things (IoT). And, of course, the Metaverse.

The global film industry is huge. It’s astoundingly enormous, and it’s growing massively. According to a study by Globe Newswire, the worldwide film industry grew from $271.83 billion (US) in March 2021 to $325.06 billion in March 2022. That’s a compound annual growth rate (CAGR) of 11.4 percent, suggesting that in another four years, by 2026, the film-making world will generate $479.63 billion. By the end of this decade, it could be worth a trillion.
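For what it’s worth, the compounding arithmetic behind projections like these is easy to check with the standard CAGR formulas. Here’s a minimal Python sketch using the quoted figures (the function names are mine, not Globe Newswire’s):

```python
def cagr(begin: float, end: float, years: float) -> float:
    """Compound annual growth rate between two values."""
    return (end / begin) ** (1 / years) - 1

def project(value: float, rate: float, years: float) -> float:
    """Project a value forward at a fixed annual growth rate."""
    return value * (1 + rate) ** years

# Growth rate implied by the 2026 projection of $479.63B
implied = cagr(325.06, 479.63, 4)
print(f"Implied CAGR 2022-2026: {implied:.1%}")

# Round-trip: projecting the 2022 figure forward at that rate
print(f"Projected 2026 market: ${project(325.06, implied, 4):.2f}B")
```

Plugging in the quoted numbers gives an implied rate of roughly 10 percent, in the same ballpark as the study’s stated 11.4 percent CAGR.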

If you’re a regular DyingWords follower, you might’ve noticed I haven’t published a book in nearly two years. That’s because I’m immersed in the film industry—studying screenwriting, producing film content under my new company Twenty-Second Century Entertainment (22 ENT), and generally learning what this business is about. I’ve also done on-camera work as a crime and forensic resource in non-scripted documentaries that flowed from blog posts I’ve created. Plus, I’ve made some great filmmaking friends who are teaching this old dog new tricks.

Before I expand on future film technology, I’ll give you a snapshot of what I’ve got on the go. My eight-part Based-On-True-Crime book series is contractually optioned by a producer who has it before a major film company. If this gets “Green Lit”, we have a total of thirty episodes loglined under the working title Occam’s Razor. My hardboiled, private detective storytelling concept called City Of Danger is a twenty-four-part series with a right-of-first-refusal agreement through a leading netstreamer. (See my webpage for City Of Danger—scheduled for 2024). The Fatal Shot is a film production “treatment” I wrote which is being “shopped around”, and I’m collaborating with a long-time colleague on a very interesting screen project titled Lightning Man that I believe has excellent film potential.

Enough of my BS. Let’s look at the futuristic film industry.

Everyone’s talking about the metaverse. Especially Mark Zuckerberg who rebranded Facebook into Meta. He’s betting big that this is Internet 3.0 and, from what I know, I’m sure he’s right even though he can’t get Apple to form a joint venture.

The term metaverse isn’t new. It’s been around for three decades and was once known as cyberspace. Although the metaverse is already here and in its infancy, or at an inflection point, it’s a hard concept to wrap your head around. Maybe it’s best to let Mr. Zuckerberg explain:

“The ‘metaverse’ is a set of virtual spaces where you can create and explore with other people who aren’t in the same physical space as you. You’ll be able to hang out with friends, work, play, learn, shop, create and more. It’s not necessarily about spending more time online — it’s about making the time you do spend online more meaningful. The metaverse isn’t a single product one company can build alone. Just like the internet, the metaverse exists whether Facebook is there or not. And it won’t be built overnight. Many of these products will only be fully realized in the next 10-15 years. While that’s frustrating for those of us eager to dive right in, it gives us time to ask the difficult questions about how they should be built.”

Zuckerberg says the metaverse is the mobile web’s successor. First there was Internet 1.0 which was static. You could surf the pages and send emails on a desktop. Internet 2.0—where we’re at now—is mobile. It’s smartphone streaming and TikToking. If you want to call the metaverse Internet 3.0, then you need to use compatible words like immersive, interoperable, and integrated. It’s a world of shared virtual experience that can happen at home, on the go, and wherever you are with a connected device.

What the metaverse holds for the film industry is not so much technical advances in production. It’s deliverability and viewer experience. The metaverse won’t just be the place you’ll be watching a movie. It’s where you’ll be fully interacting with your five senses—sight, sound, smell, taste, and touch. It’ll be like you’re right there in the middle of the set.

If you’re interested in learning more about the metaverse, here are three resources I recommend:

The Metaverse: And How It Will Revolutionize Everything, a book by Matthew Ball

Value Creation in the Metaverse, a 76-page PDF by McKinsey & Company

What is the Metaverse?, an article at Government Technology

There are two evolving technologies that’ll give you that immersed feeling. One is augmented reality (AR). The other is virtual reality (VR). There’s a big difference between the two immersive platforms.

Augmented reality is enhancing, or augmenting, real events with computerization. AR morphs the mundane, physical world into a colorful, visual place by projecting visual images and characters into an existing framework. It adds to the user’s real-life experience.

Virtual reality creates a world that doesn’t exist and makes it seem very, very real. Think the movie Avatar. VR also incorporates sensory-improving devices like goggles, helmets, headsets, and suits.

You could say computer-generated imagery, or CGI, is old technology and not something futuristic. You’d be wrong. Advancements in CGI development are nothing short of breathtaking. The CGI of five years from now will make today’s stuff look like a preschooler’s drawing.

Technology’s ability to recreate faces, bodies, and even dialogue is dramatically improving. It’s progressing to the point where it’ll be possible to make an exact replica of just about anyone. Would you like to meet a completely believable Elvis Presley? How about Marilyn Monroe?

Speaking of Elvis and Marilyn, cloned voices are becoming the thing. Computerized synthesis takes old audio of people from the past and recreates their voices in a life-like state. This process will use artificial intelligence (AI) to build a smoky Marilyn or a crooning Elvis that responds to printed dialogue. It’s like current AI text-to-speech but on steroids.

We can’t talk about futuristic filmmaking without bringing up artificial intelligence. AI is moving ahead at lightning speed and it’s bringing the film industry with it. I’m fascinated with AI developments. But I’m also a bit fearful. Here’s a DyingWords post I wrote a while back titled Helpful or Homicidal — How Dangerous is Artificial Intelligence (AI)?

One thing about AI I’m really looking forward to in the film industry is this: Artificially Intelligent Screenwriting. If you’ve ever written, or have tried to write, a screenplay, then you appreciate how much work and effort goes into it, never mind the brain drain of creating unique content.

Recently, researchers at New York University built an artificial intelligence screenwriting program. They called it Benjamin, and among other things, it wrote an original soundtrack for its movie after being trained on 30,000 songs. Can you imagine the 2025 Academy Awards: “And the Oscars for best screenplay and soundtrack go to… Benjamin the Bot.”

AI isn’t just real in script and score writing. Virtual actors and non-real celebrities are on the way in. It’ll soon be possible to select the movie cast and digitally scan them, then recreate their entire actions throughout the film without them being physically present. It’s well within the realm of possibility to have a virtual Ryan Reynolds or Anne Hathaway act their parts while the flesh and blood realities sit at home. After being paid a substantial sum for licensing their images, of course.

Turning real people into realistic avatars or digital images of themselves is a current technology. Take a look at the leading lady on my City Of Danger promo poster. That’s a real person (a stunningly attractive and stylish, high-status lady, by the way) who was scanned and run through a NextGen Pixlr filter. The plan for City Of Danger is to digitize the cast and set them loose in virtual reality following the human-written episodic scripts translated by AI. Fun stuff!

Drones are fun stuff, too. What used to be aerial filmed with helicopters and airplanes is now drone territory. Drones are far cheaper and much safer. With highly sophisticated controls and cameras, filming by drones will mostly replace piloted vehicles. Take a look at this drone footage of the new Vancouver Island Film Studios, twenty minutes north of my place: https://youtu.be/aTsyRrROx34

Remote filmmaking will put a big dent into on-site producing. With huge advances in film technology, internet sharing, and cost-cutting, more and more productions will happen on sound stages like the six built at Vancouver Island Film Studios. It’s realistic that a director—yes, a real person—will do their work remotely. Instead of fighting traffic and flight delays, a filmmaker will be able to do their job sitting on a yacht in the Maldives and direct their work in the metaverse.

3D-printed sets are soon to be here, if not here already. It’s going to be far more efficient to create film-set artifacts than to source them. Those 3D objects can also be scanned and set into virtual reality situations.

3D filming has come a long way since 1922, when audiences sat watching The Power Of Love while wearing those goofy glasses. Now we have up-close 3D on laptops, and glasses-free 3D is coming to the big screen. But the big wait is for 4D filming, and it promises to come through VR in the metaverse. Instead of only seeing height, width, and length, you’ll experience depth. You’ll be inside the picture—on the inside looking out at the 3D world.

There are massive changes coming in cameras, sound recording, and lighting effects. Have you seen Top Gun Maverick? That is amazing work, and that’s just the next step in futuristic filmmaking. And you know what? Very little was done through CGIs. It’s just super sophisticated camera, sound, and lighting effects. Here’s how they did it: https://www.indiewire.com/2022/06/top-gun-maverick-making-of-cockpit-1234729694/

Top Gun Maverick used a Sony Rialto Camera Extension System. Yes, it’s expensive, but so was renting the jets at over $11,000 per flying hour. More reasonable in my upcoming league is the no-longer-futuristic DJI Ronin 4D 4-Axis 6K Cinematic Camera that recently came online at $9,000.00, and that’s just for the lens. Think about it—a 4D, 6,000-pixel digital camera. There isn’t a 6K monitor yet made, but I bet it’s on its way.

Micro cameras have amazing potential. The future is wide open in melding nanotechnology with filmmaking. I can’t imagine what’s happening at the molecular level.

I can imagine, however, what’s happening at the post-production stage. It’s not just screenwriting, casting, set building, and cinematography that take time and money. Editing is a huge time suck in the filmmaking process. What’s just arriving is algorithmic film editing. This is AI software that thinks through the film data and makes automatic jump cuts at precisely the right moments.

Have you heard of edge computing? I hadn’t until I began investigating the futuristic film industry. Edge computing is capturing data at its source and not having to upload it to a server for processing. That eliminates having to use an expensive and laggy “middle-man” like a cloud or a mechanical server. Using edge computing to harness and develop digital data speeds up processing time and reduces costs.

Hologram displays are in their crude evolutionary form today. That’s going to change soon, and holograms are part of the new, end-product “dimensional delivery”. By dimensional delivery, I mean the 4D technology where you’ll be able to watch a digitized hologram of your show. It will be like watching a completely realistic stage play, and you’ll have the option of joining in.

“Joining in” is a fascinating film delivery concept. In the future, algorithms will track your viewing habits/choices and will give you the option of personalizing your selection. You can make yourself into an avatar and can substitute your avatar for a cast member. On the international stage, you can change your race, gender, and language.

All this talk of high-density technology needs a delivery-infrastructure makeover. Internet providers today don’t have the speed or capacity to process and send out 5K-resolution, totally digitized, virtual reality entertainment. But that’s changing, too, with 5G.

5G is the 5th generation wireless mobile network. It’s already happening, and 6G is planned. To serve the metaverse, massively higher multi-Gbps speeds and ultra-low latency are crucial. The 5G and 6G networks will deliver the films of the future that today’s 4G system can’t.

One more film-world reality is money. Movies cost a lot of money to make. I’m told a show like Occam’s Razor typically budgets at around $50,000 per edited minute of film. Doing the math, a 60-minute episode would cost $3 million, give or take a fudge factor. So, a 10-episode season would cost the film’s financier around $30 million. To me, that’s a lot of coin—a lot of coin that can be saved through emerging technology.
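That back-of-envelope math is simple enough to sketch in a few lines of Python (the figures are the ones quoted above; the variable names are mine):

```python
# Season budget arithmetic from the quoted figures
cost_per_edited_minute = 50_000    # dollars, typical for a show like Occam's Razor
minutes_per_episode = 60
episodes_per_season = 10

episode_cost = cost_per_edited_minute * minutes_per_episode
season_cost = episode_cost * episodes_per_season

print(f"Per episode: ${episode_cost:,}")   # $3,000,000
print(f"Per season:  ${season_cost:,}")    # $30,000,000
```

Change any one input—cheaper virtual production, shorter episodes, a longer season—and the total moves linearly with it, which is exactly why halving production costs matters so much to a financier.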

Future technology will significantly reduce time and expenses in filmmaking. Payment methods are changing, too. Blockchain will keep a digital trail, and funds will commonly be exchanged in cryptocurrency. Non-Fungible Tokens (NFTs) will probably be part of the package, though they’re going through a reevaluation at the moment.

I’m a newbie to the film industry, but everyone working in the business is a newbie to what’s coming at us from the future. My niche is making content—inventing and telling stories through characters, plots, and dialogues. But to make decent (meaning saleable) content, I must be aware of how the overall film production and delivery systems work. That’s what the past two years have been about.

City Of Danger seems to be saleable content. At least one film producer at a name-brand netstreamer thinks so. Realistically, the show is a few years away—2024 at the earliest—because the technology for what we want to portray isn’t perfected yet. Our plan is to screenwrite the 24 episodes (underway) and have it ready to be digitally produced in virtual reality by scanning the actors, turning them into avatars, and showing them as you see Susan Silverii who graces the promo poster. This should cut production costs to maybe half of today’s typical rates of filming a live actor and on-location series like Occam’s Razor.

Wish us luck. Or, as they say in theatrics, “Break a leg”.