AI is going to destroy art the same way Photoshop, or photography, or pre-made tubes of paint, destroyed art. It’s a tool; it helps people take the idea in their head and put it in the world. And it lowers the barrier to entry: now you don’t need years of practice in drawing technique to bring your ideas to life, you just need ideas.
If AI gets to a point where it can give us creative, original art that sparks emotion in novel ways… well, we probably also made a superintelligent AI, and our list of problems is much different than it is today.
I like the idea of AI as a tool artists can use, but that’s not a capitalist’s viewpoint, unfortunately. They will try to replace people.
And if text-based images remain uninspired and samey… oh well? Congratulations, you will forever after be able to spot when someone’s extremely timely gag image was cranked out via its description, rather than badly composited from Google Images results. I’ve done a lot of bad compositing for Something Awful shitpost threads and speed beats effort every time.
Tbh I hate Photoshop for a lot of photography. It is unfortunately necessary for macro photography, which is the only type I do. Which is one of the reasons mine is not nearly as good as it could be: I refuse to use it.
This. AI was never made for the sole purpose of creating art or beating humans at chess. Those are just side quests for the real stuff.
Some people also don’t care whether it’s a Rembrandt or a Picasso or an AI, but like to dabble in the arts anyway because it’s something they like to do.
It’s fulfilling (I do love Renoir though).
I hate this sentiment. It’s not a tool like a brush is to a canvas. It’s a machine that runs off the fuel of our creative achievements. The sheer amount of pro-AI shit I read from this place just makes me that much closer to putting a bullet in my fucking skull
Once you reincarnate in the future, generative models will make even better art than they do today. It’ll be a losing battle against time.
Shill
Luddite
Tech bros are not really techies themselves; they are really just Wall Street bros with tech as their product. Most claim they can code, but if they were coders they would be coding. They are not coders, they are businessmen through and through who just happen to sell tech.
99% of people in tech leadership are just regurgitating marketing jargon with minimal understanding of the underlying tech.
I think approximation is the right word here. It’s pretty cool and all, and I’m looking forward to seeing how it will develop. But it’s mostly a fun toy.
I’m stoked for the moment the tech bros understand that an AI is way better at doing their job than it is at creating art.
I think one thing you and many other people misunderstand is that the image generation aspect of AI is a sideshow, both in use and in intent.
The ability to generate images from text-based prompts is basically a side effect of the capability they are actually spending billions on, which is object detection.
I work in AI. LLMs are cool and all, but I think it’s mostly hype at this stage. While some jobs will be lost (voice work, content creation), my true belief is that we’ll see two things increase:
- The release of productivity tools that use LLMs to help automate or guide menial tasks.
- The failure of businesses that try to replicate skilled labour using AI.
In order to stop point two, I would love to see people and lawmakers really crack down on AI replacing jobs, and regulate the process of replacing job roles with AI until it can sufficiently replace a person. If, for example, someone cracks self-driving vehicles, then it should be the responsibility of the owning companies and the government to provide training and compensation so that everyone being “replaced” can find new work. This isn’t just to stop people from suffering, but to stop the idiot companies that’ll sack their entire HR department, automate it via AI, and then get sued into oblivion because it discriminated against someone.
I’ve also heard that, as far as we can figure, we’ve basically reached the limit on certain aspects of LLMs already. Basically, LLMs need a FUCK ton of data to be good. And we’ve already pumped them full of the entire internet, so all we can do now is marginally improve algorithms whose inner workings we barely understand. Think about that: the entire internet isn’t enough to successfully train LLMs.
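(For context, the empirical scaling-law results people usually cite here, e.g. the “Chinchilla” paper, roughly model loss as a function of parameter count N and training tokens D as

L(N, D) \approx E + \frac{A}{N^{\alpha}} + \frac{B}{D^{\beta}}

where E, A, B, \alpha, \beta are fitted constants that differ between studies, so treat this as an illustrative sketch rather than gospel. The relevant point is that once D is capped by the amount of usable text in existence, the B / D^{\beta} term stops shrinking no matter how much compute you pour into N.)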
LLMs have taken some jobs already (like audio transcription, basic copyediting, and aspects of programming); we’re just waiting for the industries to catch up. But we’ll need to wait for a paradigm shift before they start producing pictures and books or doing complex technical jobs with few enough hallucinations that we can successfully replace people.
My own personal belief is very close to what you’ve said. It’s a technology that isn’t new, but it had been assumed to not be as good as compositional models because it would cost a fuck-ton to build and would result in dangerous hallucinations. It turns out that both are still true, but people don’t particularly care. I also believe that one of the reasons ChatGPT has performed so well compared to other LLM initiatives is that it was trained on a huge amount of stolen data that would get OpenAI in a LOT of trouble.
IMO, the real breakthroughs will be in academia. Now that LLMs are popular again, we’ll see more research into how they can be better utilised.
Afaik OpenAI got their training data from what was basically a free resource that they just had to request access to. They didn’t think much about it, and neither did anyone else. No one could have predicted that it would be that valuable until after the fact, at which point it seems obvious in retrospect.
The (really, really, really) big problem with the internet is that so much of it is garbage data. The number of false and misleading claims spread endlessly on the internet is huge. To rule those beliefs out of the data set, you need something that can grasp the nuances of everything from published, peer-reviewed data, to deliberately misleading propaganda, to fringe conspiracy nuts who believe the Earth is controlled by lizards with planes and that only a spritz bottle full of vinegar can defeat them, and everything in between.
There is no person, book, journal, website, newspaper, university, or government that has reliably produced good, consistent help on questions of science, religion, popular lies, unpopular truths, programming, human behavior, economic models, and many, many other things that continuously have an influence on our understanding of the world.
We can’t build an LLM that won’t consistently be wrong until we can stop being consistently wrong.
Yeah, I’ve heard medical LLMs are promising when they’ve been trained exclusively on medical texts. Same with the models that have been trained exclusively on DNA, etc.
Nah, fuck HR, they’re the shield companies hide behind to discriminate within the margins
I think the proper route is a labor replacement tax to fund retraining and replacement pensions
I sincerely doubt AI voiceover will outperform human actors in the next 100 years on any metric, including cost or time savings.
Not sure why you’re downvoted, but this is already happening. There was a story a few days ago of a long-time BBC voice-over artist who lost their gig. There have also been several stories of VA workers being handed contracts that allow the reuse of their voice for AI purposes.
The artist you’re referring to is Sara Poyzer - https://m.imdb.com/name/nm1528342/ - she was replaced in one specific way:
The BBC is making a documentary about someone (as yet unknown) who is dying and has lost the ability to speak. Poyzer was on pencil (like standby, hold the date, but not confirmed) to narrate the dying person’s words. Instead they contracted an AI agency to use AI to mimic the dying person’s voice (from when they could still speak).
It would likely be cheaper and easier to hire an impressionist, or Ms Poyzer herself, but I assume they are doing it for the “novelty” value, and with the blessing of the terminally ill person.
For that reason I think my point still stands: they have made the work harder and more expensive, and created a negative PR storm, all problems created by AI and not solved by it.
You are incorrect that AI voice contracts are commonplace; SAG negotiated that use of AI voice tools is to be compensated as if the actor had recorded the lines themselves (which most actors do from home nowadays). So again, it’s at best the same cost for an inferior product, but actually more expensive, because before you were paying just the actor and now you’re paying the actor AND the AI techs.
edit: and not just that, AI voice products are bad. Yes, you can maybe fudge the uncanny valley a bit by sculpting the prompts and the script to edge towards short sentences, delivered in a monotone, narrating an emotionless description without caring about stress patterns or emphasis, meter, inflection or caesura, and without any breathing sounds (sometimes a positive, sometimes a negative), but all of that is in an actor’s wheelhouse for free.
UBI is better and has more momentum with the general public
Are you saying that if a company adopts AI to replace a job, they should have to help the replaced workers find new work? Sounds like something one can loophole by cutting the department for totally unrelated reasons before coincidentally realizing that they can have AI do that work, which they totally didn’t think of before firing people.
That’s why it would need regulation to work…
Art itself isn’t useless, it’s just incredibly replicable. There is so much good art out there that people don’t need to consume crap.
It’s like saying there is no money in being a footballer. Of course there is loads of money in being a footballer. But most people that play football don’t make any money.
Billions were spent inventing and producing the calculator device.
Human calculators are now extinct.
Complex calculations are far more accessible.
This has a secondary effect of making average people incapable of estimation in their heads. Hopefully in the future people won’t be incapable of writing and art.
Average people weren’t doing complex math in their head back when human calculators were a thing.
But they were estimating things. Somehow illiterate people ran marketplaces for thousands of years.
Turing Incompleteness is a pathway to many powers the Computer Scientists would consider incalculable.
In fact, there are infinitely many problems that cannot be solved by Turing machines!
(There are countably many Turing-computable problems and uncountably many non-Turing-computable problems)
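(Sketch of the counting argument, for anyone curious: every Turing machine can be written down as a finite string over a finite alphabet, so

|\{\text{Turing machines}\}| \le |\{0,1\}^*| = \aleph_0,

while a decision problem is an arbitrary set of input strings, so

|\{\text{decision problems}\}| = |\mathcal{P}(\{0,1\}^*)| = 2^{\aleph_0},

which is uncountable by Cantor’s theorem. Since each machine decides at most one problem, at most countably many problems are decidable, leaving uncountably many that aren’t.)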
That’s a pretty shit take. Humankind spent nearly 12 thousand years figuring out the combustion engine. It took a million years to figure out farming. Compared to that, less than 500 years to create general intelligence will be a blip in time.
I think you’re missing the point, which I took as this: what arts and humanities folks do is valuable (as evidenced by efforts to recreate it), despite common narratives to the contrary.
Of course it’s valuable. So is, e.g., soldering components on a circuit board, but we have robots for doing that at scale now.
Do you think robots will ever become better than humans at creating art, in the same way they’ve become better than us at soldering?
Not if climate change drives humans extinct before they can make those improvements
Feel free to audit my comments to confirm my distinct lack of GPT enthusiasm, but that question is unanswerable.
What is “creating art”? A distinctly human thing? Then trivially no. Idk how many people go with this interpretation, though I think many artists and art appreciators do at least some of the time.
Is it drawing pretty pictures? Probably too reductive for even the most hardline tech enthusiasts, but computers are already very good at this. If I want to, say, get my face into something that looks like an old-timey oil painting, computers are way faster than humans.
Is it making things that make us feel something? They can probably get pretty good at this. Although it’s unclear how novel the results will be, most people aren’t exposed to most art, so you could probably produce novel feelings on an individual level pretty well.
Art is so fuzzy and used with such a range of definitions it’s not really clear what this is asking.
Even if they’re better, the future might still suck. Machines are technically better at all the components of carpentry than humans, but I’d rather furniture wasn’t soulless minimalist MDF landfill garbage and carpenters could still earn a living. Even if that means my chairs were a bit uneven.
Yep.
Quite easily, yes. Unlike humans, with their limited lifespans and slow minds, Artificial Intelligence could create hundreds of different paintings in the time it’d take me to finish one.
Being able to put out lots of works isn’t the same as being able to come up with good, meaningful art?
That depends on things we don’t know yet. If it can be brute-forced (throw loads of computation power at it, gazillions of rounds of trial & error, petabytes of data including human opinions), then yes, “lots of work” can be an equivalent.
If it can’t, we have a mystery to solve. Where does this magic come from? It cannot be broken down into data and algorithms, but still emerges in the material world? How? And what is it, if not dependent on knowledge stored in matter?
On the other hand, how do humans come up with good, meaningful art?
Talent? Practice, more like. Isn’t that just another equivalent of “lots of work”? This magic depends on many learned data points and acquired algorithms, executed by human brains. There’s also survivorship bias: millions of people practice art, but only a tiny fraction are recognized as artists (if you ask the magazines and the wallets). Would we apply the same measure to computer-generated art, or would we expect it to shine in every instance?
As “good, meaningful art” still lacks a good, meaningful definition, I can see humans moving the goalposts as technology progresses, so that it always remains a human domain. We just like to feel special and have a hard time accepting humiliations like being pushed out of the center of the solar system, or placed on one random planet among billions of others, or being just one of many animal species.
Or maybe we are unique in this case. We’ll probably be wiser in a few decades.
What does it even mean to brute-force creating art? Trying all the possible prompts to some image model?
The approach people take to learning or applying a skill like painting is not brute-forcing; there is actual structure and method to it.
Really only around 80 years between the first machines we’d consider computers and today’s LLMs, so I’d say that’s pretty damn impressive
That’s why the sophon was sent to disrupt our progress. Smh
LLMs are not a step to AGI. Full stop. Lovelace called this like 200 years ago. Turing and Minsky called it in the 40s.
Pray tell, when did we achieve AGI so that you can say this with such conviction? Oh, wait, we didn’t - therefore the path there is still unknown.
Okay, this is no more a step to AGI than the publication of ‘blindsight’ or me adding tamarind paste to sweeten my tea.
The project isn’t finished, but we know basic stuff. And yeah, sometimes history is weird, sometimes the enlightenment happens because of oblivious assholes having bad opinions about butter and some dude named ‘le rat’ humiliating some assholes in debates.
But LLMs are not a step to AGI. They’re just not. They do nothing intelligence does that we couldn’t already do. You’re doing pareidolia. Projecting shit.
When Jewish folks made their first mud golem ages ago?
Humanity didn’t spend all that time figuring those things out, though. Humanity grew over that time to make them possible (and AI is younger than 500 years, IMO).
Also, we are the same people today as people were then. We just have access to what our parents’ generation made, and so on.
AI is younger than 500 years IMO
Hence “will be a blip in time”
we are the same people today as people were then. We just have access to what our parents’ generation made, and so on.
Completely disconnected from and irrelevant to anything I wrote.
less than 500 years to create general intelligence will be a blip in time.
You jinxed it. We aren’t gonna be around for 500 years now are we?
This is some pretty weird and lowkey racist exposition on humanity.
Humankind isn’t a single unified thing. Individual cultures have their own modes of subsistence and transportation that are unique to specific cultural needs.
It’s not that it took 1 million years to “figure out” farming. It’s that 1 specific culture of modern humans (biologically, humans as we conceive of ourselves today have existed for about 200,000 years, with close relatives existing for in the ballpark of 1M years) started practicing a specific mode of subsistence around 23,000 years ago. Specific groups of indigenous cultures remaining today still don’t practice agriculture, because it’s not actually advantageous in many ways – stored foods are less nutritious, agriculture requires a fairly sedentary existence, it takes a shit load of time to cultivate and grow food (especially when compared to foraging and hunting), which leads to less leisure time.
Also, where did you come up with the number 12,000 for “figuring out” the combustion engine? Genuinely curious. Like, were we “working on it” for 12k years? I don’t get it. And the combustion engine isn’t exactly a net positive anyway; it has come with some pretty disastrous consequences. I say this because you’re proposing a linear path forward for “humanity”, when the reality is that humans are many things, and progress viewed in this way has a tendency toward racism, or at least ethnocentrism.
But also yeah, the point of this meme is “artists are valuable.”
This is some pretty weird and lowkey racist exposition on humanity.
Getting “racism” from that post is a REAL stretch. It’s not even weird; agriculture and mechanization are widely considered good things for humanity as a whole
Humankind isn’t a single unified thing. Individual cultures have their own modes of subsistence and transportation that are unique to specific cultural needs.
ANY group of humans beyond the individual is purely a social construct, and classing humans into a single group is no less sensible than grouping people by culture, family, tribe, country, etc.
It’s not that it took 1 million years to “figure out” farming. It’s that 1 specific culture of modern humans (biologically, humans as we conceive of ourselves today have existed for about 200,000 years, with close relatives existing for in the ballpark of 1M years) started practicing a specific mode of subsistence around 23,000 years ago. Specific groups of indigenous cultures remaining today still don’t practice agriculture, because it’s not actually advantageous in many ways – stored foods are less nutritious, agriculture requires a fairly sedentary existence, it takes a shit load of time to cultivate and grow food (especially when compared to foraging and hunting), which leads to less leisure time.
Agriculture is certainly more efficient in terms of nutrition production for a given calorie cost. It’s also much more reliable. Arguing against agriculture as a good thing for humanity as a whole is the thing that’s weird.
I’m really not “arguing against agriculture,” I’m pointing out that there are other modes of subsistence that humans still practice, and that that’s perfectly valid. There are legitimate reasons why a culture would collectively reject agriculture.
But in point of fact, agriculture is not actually more efficient or reliable. Agriculture does allow for centralized city-states in a way that foraging/hunting/fishing usually doesn’t, with a notable exception being many indigenous groups on the western coast of Turtle Island.
A study positing that in fact, agriculturalists are not more productive and in fact are more prone to famine: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3917328/
But the main point I was trying to make is that different expressions of human culture still exist, and not all cultures have followed along the trajectory of the dominant culture. People tend to view colonialism, expansion and everything that means as inevitable, and I think that’s a pretty big problem.
This kind of thinking is dangerous and will hinder planetary unification…
All I’m trying to point out is that distinct cultures are worthy of respect and shouldn’t be glossed over.
But be real with me: can you think of a single effort for “planetary unification” that wasn’t a total nightmare? I sure can’t.
This attitude is what prevents us from unifying…smh
I just love the idjits who think that not showing empathy to the people AI bros are trying to put out of work will save them when the algorithms come for their jobs next
When LeopardsEatingFaces becomes your economic philosophy
Matthew Dow Smith, whoever the fuck that is, has a sophisticated delusion about what’s actually going on and he’s incorporated it into his persecution complex. Not impressed.
Honestly, people are desperately trying to automate physical labor too. The problem is the machines don’t understand the context of their work, which can cause problems. All the work on AI is the result of trying to make a machine that can. The arts and humanities side is more of a side project
The arts and humanities side is more of a side project
I’ll add:
A side project that isn’t a life-or-death situation like most of those physical labor things you’re talking about. Art also isn’t bound or constrained by rules and regulations like those jobs, and if the AI fails at art then there’s no problem. Nobody would care.
Besides, if it fails at art it might even create something we never thought of
This is fundamentally the opposite of what generative AI does. Its fail state is basically regurgitating its training data intact.
So… art is essentially failing ahaha.
Yeah pretty much. I think. I am no art connoisseur though. If I see pretty drawings or images or whatever, I like.
if the AI fails at art then there’s no problem. Nobody would care.
Nothing wrong with automating tasks that previously needed human labour. I would much rather sit back and chill, and let automation do my bidding
If only the people in control of the wealth would let the rest of us chill while the machines do all the labor.
Yeah if only I didn’t have to farm food all day, and worry about the constant gnawing of my empty stomach, and the predators at my door, then I could maybe sit and watch some netflix or play video games, listen to concerts that took place fifty years ago, or just soak in a hot tub of water, our horrible society keeps all that leisure for the most wealthy.
that’s a social problem, not technology’s fault.
It’s a psychological problem. I chill quite a bit more than most people in history, and in ways people from twenty years ago couldn’t imagine.
I say it’s a psychological problem because despite how overwhelmingly incredible our society is, people are totally committed to this notion that it sucks.
I love my life. I’d rather be low on the economic ladder in today’s world than anywhere in the hierarchy of any previous incarnation of our civilization. Our world is absolutely fucking amazing, and I thank god I have the presence of mind to see past the anti-everything propaganda and actually have a little gratitude for all I’ve inherited from my ancestors, who actually suffered miserable conditions to give me this world.
I believe I read a headline in my local news about AI being implemented in this country’s tax system and in the evaluation of cancer patients. I could try to find a link, although it would be in a different language.
They’re misunderstanding the reasoning for spending billions.
The reason to spend all that money on approximation is so we can remove arts and humanities majors altogether, once enough approximation yields results similar to present-day chess programs, which now regularly beat humans and grandmasters. Their vocation is doomed to the niche, like most of humanity’s, eventually.
Removed by mod
It’s not this guy’s fault your vocation is doomed
If you think arts and humanities are useless, you probably lack an imagination.
Like completely.
I won’t say you’re useless, because simple-minded grunts are needed.
Humanity wouldn’t exist without the arts.
Ah yes, “the arts”. Definitely the point of the humanities, and nothing to do with categorizing the world into “important people” and “simple-minded grunts”.
Humanities students don’t read these days, and it shows.
“Art” as a term is so all-encompassing that it’s hard to define what is and isn’t art.
I’m sure you can rustle up some very reductive few word definition, but the most popular ones go something like “the expression or application of human creative skill and imagination”, and that’s a very broad definition, wouldn’t you agree?
I’m sure you’d also agree there are just some people who never seem to express or apply any of their creative skill or imagination (and some who genuinely seem to lack any altogether), despite still being productive members of society.
Not everyone needs to be an artist; a minority of the population will do. But without artists, we would all perish. Those people who don’t necessarily express or apply creative skill or imagination still most certainly enjoy it, and probably couldn’t get through their jobs without it. (Repetitive work is just so much easier while listening to music, and I’m sure that’s not a controversial statement.)
So what do humanities students do these days then, according to you, since they “don’t read”?
I’d love to see some data on the people who believe that AI fundamentally can’t do art and the people who believe that AI is an existential threat to artists.
Anecdotally, there seems to be a large overlap between the adherents of what seem to be mutually exclusive positions and I wish I understood that better.
People used to pay lots of money to digital artists for various tasks. Now generative models like Stable Diffusion can do many of those things, such as graphic design. This is resulting in people paying less to artists.
I get that and there are a lot of jobs that people used to pay for and no longer do.
The entire horse industry has mostly collapsed. I couldn’t get a job as scribe. With any luck, all the industries around fossil fuel will go away. We’re going to pay less to most people in those industries too.
Well yes, since the economy is in shambles, us normal people will try to spend as little money as possible to make sure we are safe
Chill, tech bros are spending billions to oust every unmarketable degree and skillset.
Also unmarketable ≠ “useless”