How an AI-written book shows why the tech 'terrifies' creatives
For Christmas I got an intriguing present from a good friend - my very own "best-selling" book.
"Tech-Splaining for Dummies" (fantastic title) bears my name and my photo on its cover, and it has radiant evaluations.
Yet it was entirely written by AI, with a few basic prompts about me supplied by my friend Janet.
It's an interesting read, and genuinely funny in parts. But it also meanders rather a lot, and is somewhere in between a self-help book and a stream of anecdotes.
It captures my chatty style of writing, but it's also a bit repetitive and very verbose. It may have gone beyond Janet's prompts in collating information about me.
Several sentences start "as a leading technology journalist..." - cringe - which could have been scraped from an online bio.
There's also a mysterious, repeated hallucination in the form of my cat (I have no pets). And there's a metaphor on nearly every page - some more random than others.
There are lots of companies online offering AI-book writing services. My book was from BookByAnyone.
When I contacted the chief executive Adir Mashiach, based in Israel, he told me he had sold around 150,000 personalised books, mainly in the US, since pivoting from compiling AI-generated travel guides in June 2024.
A paperback copy of your own 240-page bestseller costs £26. The company uses its own AI tools to generate them, based on an open source large language model.
I'm not asking you to buy my book. In fact you can't - only Janet, who created it, can order any further copies.
There is currently nothing to stop anyone creating one in any person's name, including celebrities - although Mr Mashiach says there are guardrails around violent content. Each book contains a printed disclaimer stating that it is fictional, created by AI, and designed "solely to bring humour and joy".
Legally, the copyright belongs to the company, but Mr Mashiach stresses that the product is intended as a "personalised gag gift", and the books are not sold on.
He wants to broaden his range, generating different genres such as sci-fi, and perhaps offering an autobiography service. It's designed to be a light-hearted form of consumer AI - selling AI-generated goods to human customers.
It's also a bit frightening if, like me, you write for a living. Not least because it probably took less than a minute to generate, and it does, certainly in some parts, sound a lot like me.
Musicians, authors, artists and actors worldwide have expressed alarm about their work being used to train generative AI tools that then churn out similar content based on it.
"We should be clear, when we are discussing data here, we in fact imply human creators' life works," states Ed Newton Rex, creator of Fairly Trained, which projects for AI companies to regard creators' rights.
"This is books, this is articles, this is images. It's artworks. It's records ... The entire point of AI training is to find out how to do something and after that do more like that."
In 2023 a song featuring AI-generated voices of Canadian singers Drake and The Weeknd went viral on social media before being pulled from streaming platforms because it was not their work and they had not consented to it. It didn't stop the track's creator trying to nominate it for a Grammy award. And although the artists were fake, it was still hugely popular.
"I do not think the use of generative AI for imaginative functions ought to be prohibited, however I do think that generative AI for these functions that is trained on individuals's work without permission need to be prohibited," Mr Newton Rex includes. "AI can be really effective however let's develop it fairly and fairly."
In the UK some organisations - including the BBC - have chosen to block AI developers from trawling their online content for training purposes. Others have decided to collaborate - the Financial Times has partnered with ChatGPT developer OpenAI for example.
The UK government is considering an overhaul of the law that would allow AI developers to use creators' content on the internet to help develop their models, unless the rights holders opt out.
Ed Newton Rex describes this as "madness".
He points out that AI can make advances in areas like defence, healthcare and logistics without trawling the work of authors, journalists and artists.
"All of these things work without going and altering copyright law and destroying the incomes of the country's creatives," he argues.
Baroness Kidron, a crossbench peer in the House of Lords, is also strongly against eliminating copyright law for AI.
"Creative industries are wealth developers, 2.4 million tasks and an entire lot of joy," states the Baroness, who is also an advisor to the Institute for Ethics in AI at Oxford University.
"The government is undermining one of its best carrying out industries on the vague pledge of growth."
A government spokesperson said: "No move will be made until we are absolutely confident we have a practical plan that delivers each of our objectives: increased control for rights holders to help them license their content, access to high-quality material to train leading AI models in the UK, and more transparency for rights holders from AI developers."
Under the UK government's new AI plan, a national data library containing public data from a wide range of sources will also be made available to AI researchers.
In the US the future of federal rules to control AI is now up in the air following President Trump's return to the presidency.
In 2023 Biden signed an executive order that aimed to boost the safety of AI with, among other things, firms in the sector required to share details of the workings of their systems with the US government before they are released.
But this has now been repealed by Trump. It remains to be seen what Trump will do instead, but he is said to want the AI sector to face less regulation.
This comes as a number of lawsuits against AI firms, and particularly against OpenAI, continue in the US. They have been brought by everyone from the New York Times to authors, music labels, and even a comedian.
They claim that the AI firms broke the law when they took their content from the internet without their consent, and used it to train their systems.
The AI firms argue that their actions fall under "fair use" and are therefore exempt. There are a number of factors which can constitute fair use - it's not a straightforward definition. But the AI sector is under increasing scrutiny over how it gathers training data and whether it should be paying for it.
If this wasn't all enough to think about, Chinese AI firm DeepSeek has shaken the sector over the past week. It became the most downloaded free app on Apple's US App Store.
DeepSeek claims that it developed its technology for a fraction of the cost of the likes of OpenAI. Its success has raised security concerns in the US, and threatens America's current dominance of the sector.
As for me and a career as an author, I think that at the moment, if I really want a "bestseller" I'll still have to write it myself. If anything, Tech-Splaining for Dummies highlights the current weakness of generative AI tools for bigger projects. It is full of mistakes and hallucinations, and it can be quite hard to read in parts because it's so verbose.
But given how quickly the tech is evolving, I'm not sure how long I can remain confident that my considerably slower human writing and editing skills are any better.