Barack Obama: “For elevator music, AI is going to work fine. Music like Bob Dylan or Stevie Wonder, that’s different”

Barack Obama has weighed in on AI’s impact on music creation in a new interview, saying, “For elevator music, AI is going to work fine”.
why do i care what obama feels about either of these
I mean — he’s defending human creativity and he’s kind of right. AI can recreate variations of the things it is trained on, but it doesn’t create new paradigms.
People always say AI only creates variations, but many successful TV shows are variations too. I started watching sitcoms from the 70s, and many things were copied/adapted in recent shows.
That's just muzak for a visual medium.
99% of everything people create is a variation.
Truly innovative anything is RARE.
There’s just stuff and things people haven’t thought to combine with stuff yet.
Yeah, also I think there is something about the human connection and communicating personal ideas and feelings that just isn't there with AI-generated art. I could see a case for the argument that a lot of music today is recorded by artists who didn't write it, and that they are expressing their own feelings through their performance of someone else's creation. And is it really all that different if an AI wrote something that resonated with an artist who ultimately performed it? For a good chunk of pop-culture regurgitation, that argument may be completely valid. But in my opinion, the best art communicates emotion, which is an experience unique to biology. AI might be able to approximate it, and sure, there's a human prompting the AI who might genuinely have those feelings, but there's a hollowness to it that I struggle to ignore. But maybe I'm just getting older and will be yelling at clouds before long.
…so I’ve been on a shit load of elevators, and I don’t recall a single one of them having music. For as common a trope as it is, you’d think elevator music would be more common in actual elevators.
It's like porn: it all used to have music, and people still make jokes about how bad it was, but now it's just gone.
Elevators used to have porn? :o /s
Yes, but it was bad, so now they don’t. Keep up!
Up and down that shit all day long pal.
Fuck, I’m old. :(
It’s not as common as it used to be, but I think the point was kind of that you’re not supposed to notice it?
Look into "muzak" (the style of music; apparently it's also a brand, according to Google), and some of Brian Eno's ambient albums like "Music for Airports", which is definitely a bit sparser than elevator music (which was often smooth-jazz versions of classic songs), but along similar lines.
I don’t like to think I’m that old, and I 100% remember elevator music.
Edit: was possibly thinking of “musique concrete” rather than muzak.
I love Eno's ambient music, but it's really distinct from the cheesy muzak you're referring to.
I may have been thinking of “musique concrete” rather than muzak
Yeah they always go with awkward silence
The quiet fart is king.
I worked in an office that installed music in the bathrooms. It wasn’t there for a long time, and then they added it. An email went out at one point instructing people to stop turning off the music (someone figured out where the Sonos controls were I guess). Someone at the top had decided it was IMPERATIVE to have something to listen to other than the coworker grunting next to you.
Some hotel elevators have it.
But yeah, I don’t recall the last time I heard music in a residential elevator.
Do people actually care what Obama has to say about AI? I’m just having a hard time seeing where his skillset overlaps with this topic.
Probably as much as I care about most other people's thoughts on AI. Speaking as someone who works in AI: 99% of the people making noise about it know fuck all about it, and are probably just as qualified as Barack Obama to have an opinion on it.
What do you do exactly in AI? I’m a software engineer interested in getting involved.
I work for Amazon as a software engineer, and primarily work on a mixture of LLMs and compositional models. I work mostly with scientists and legal entities to ensure that we are able to reduce our footprint of invalid data (i.e. anything that includes deleted customer data, anything that is blocked online, things that are blocked in specific countries, etc.). It's basically data prep for training and evaluation, alongside in-model validation for specific patterns that indicate a model contains data it shouldn't have (and then releasing a model that doesn't have that data within a tight ETA).
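To give a rough idea of the "in-model validation" part, conceptually it's something like the sketch below: probe a trained model with strings that were supposed to be removed, and flag the model if it can reproduce them. Everything here (the generate() API, the canary strings, the prefix length) is made up for illustration, not actual tooling.

```python
# Hypothetical sketch of "in-model validation": check whether a model can
# reproduce text that should have been scrubbed from its training data.
# generate(prompt) -> str stands in for whatever completion API the model has.
def contains_deleted_data(generate, canaries, prefix_len=20):
    leaked = []
    for canary in canaries:
        prompt, tail = canary[:prefix_len], canary[prefix_len:]
        completion = generate(prompt)
        # If the model completes the prompt with the removed text,
        # it has memorized data it should not contain.
        if tail.strip() and tail.strip() in completion:
            leaked.append(canary)
    return leaked

# Stub "model" that has memorized one record, just to show the check firing:
memorized = "customer 4471 requested account deletion on 2023-05-02"
fake_model = lambda p: memorized[len(p):] if memorized.startswith(p) else ""
print(contains_deleted_data(fake_model, [memorized, "some other deleted record text"]))
```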
It can be interesting at times, but the genuinely interesting work seems to happen on the science side of things. They do some cool stuff, but have their own battles to fight.
That sounds cool, I've had roles that were heavy on data cleansing, although never on something so interesting. What languages/frameworks are used for transforming the data? I understand if you can't go into too much detail.
I did wonder how much software engineers contribute in the field; so it's the scientists doing the really interesting stuff when it comes to AI? Not surprising, I guess 😂
I'm a full stack engineer and I was thinking of getting into contracting; now I'm not so sure. I don't know enough about AI's potential coding capabilities to know whether I should be concerned about job security in the short or long term.
Getting involved in AI in some capacity seems like a smart move though…
We do a lot of orchestration of closed environments, so that we can access critical data without worry of leaks. We use Spark and Scala for most of our applications, with step functions and custom EC2 instances to host our environments. This way, we build verticals that can scale with the amount of data we process.
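The filtering side boils down to something like this rough sketch, written in PySpark rather than Scala just to keep it short; the bucket paths and the record_id column are made up, not a real setup.

```python
# Rough sketch (hypothetical paths/columns): drop records whose IDs are on a
# blocklist before the data is used for training.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("invalid-data-filter").getOrCreate()

records = spark.read.parquet("s3://training-data/raw/")              # candidate training data
blocked = spark.read.parquet("s3://compliance/blocked-record-ids/")  # deleted/blocked IDs

# left_anti join keeps only records whose record_id is NOT in the blocklist
cleaned = records.join(blocked, on="record_id", how="left_anti")

cleaned.write.mode("overwrite").parquet("s3://training-data/cleaned/")
spark.stop()
```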
If I’m perfectly honest, I don’t know how smart a move it is, considering our org just went through layoffs. We’re popular right now, but who knows how long for.
It can be interesting at times, but to be honest, if I were really interested in it, I would go back and get my PhD so I could actually contribute. Sometimes it feels like SWEs are support roles, and science managers only really care that we are unblocking scientists from their work. They rarely give a shit if we release anything cool.
Tbf everyone is entitled to have an opinion, including Obama
There is a bit of a difference between caring about an opinion and tolerating one. Obama's opinions on AI are unqualified pop-culture nonsense. They wouldn't be relevant in an actual discussion that cited the relevant technical, economic, and philosophical aspects of AI.
Sure, care about it or don't, I don't care. It was the "being qualified to have an opinion" bit I didn't like. I don't have to be qualified to have an opinion, and sure, I can write an opinion piece and fewer people will read it than Obama's. I might not be qualified to teach on the subject, but everyone is qualified to form their own opinion.
But maybe that’s just overly pedantic on my side. You are qualified to have a different opinion.
I know this was once said about the automobile, but I am confident in the knowledge that AI is just a passing fad
Why? It’s a tool like any other, and we’re unlikely to stop using it.
Right now there's a lot of hype because some tech was developed that made a marked impact on consumers, and that's likely to ease off a bit, but the actual AI and machine learning technology was a thing for years before the hype, and will continue after it.
Much like voice-driven digital assistants, it's unlikely to redefine how we interact with technology, but every other way I used to set a short timer has been obsoleted at this point, and I'm betting that autocomplete having insight into what you're writing will just be the norm going forward.
It’s just a Chinese room dude, it doesn’t actually do anything useful
You not having a job where you work at a level to see how useful AI is just means you don’t have a terribly important job.
What a brain-drained asshole take to have. But I've seen your name before in my replies, and it makes sense that you'd have it.
AI is useful for filling out quarterly goal statements at my job, and boy are those terribly important… 😆
What?
At best you’re arguing that because it’s not conscious it’s not useful, which… No.
My car isn't conscious and it's perfectly useful. A system that can analyze patterns and either identify instances of the pattern or extrapolate on the pattern is extremely useful. It's the "hard but boring" part of a lot of human endeavors.
We’re gonna see it wane as a key marketing point at some point, but it’s been in use for years and it’s gonna keep being in use for a while.
A system that can analyze patterns and either identify instances of the pattern or extrapolate on the pattern is extremely useful. It’s the “hard but boring” part of a lot of human endeavors.
I agree with most of what you’re saying here, but just wanted to add that another really hard part of a lot of human endeavors is actual prediction, which none of these things (despite their names) actually do.
These technologies are fine for figuring out that you often buy avocados when you buy tortillas, but they were utter shit at predicting anything about, for instance, pandemic supply chains…and I think that’s at least partially because they expect (given the input data and the techniques that drive them) the future to be very similar to the past. Which holds ok, until it very much doesn’t anymore.
Well, I would disagree that they don’t predict things. That’s entirely what LLMs and such are.
Making predictions about global supply chains isn’t the “hard but boring” type of problem I was talking about.
Circling a defect, putting log messages under the right label, or things like that is what it's suited for. Nothing is good at predicting global supply chain issues. It's unreasonable to expect AI to be good at it when I am also shit at it.
I'm just a dude who does general labor, and I have lots of insights about AI just because I'm interested and smart. People tend to come to me just to hear what I have to say.
Now look at Obama. He’s all of that and much more in the eyes of a society that’s put Obama in the spotlight. He can talk about totally boring stuff and people will still respect his opinion.
Why would people think he knows anything about AI?
Because he's a world leader, and AI programs are answering search engine queries with what you want to hear now, not actual answers. Ain't no way he's unaware of that.
Because you can teach a teen to do it in two weeks. He was a constitutional law professor, as well as the first elected African-American president of the United States. I learned LLMs in a couple of months, and I never used a computer until 2021. Why are you gatekeeping?
Using the end product and having any idea how it works are two VERY different things.
I agree; my argument is that neither is challenging for even the average person if they really want or need to understand how these models produce refined noise informed by human patterns.
There are electricians everywhere you know.
This isn’t a random person thoughtlessly yelling one-sentence nonsense pablum on the Internet like you.
You think this person can’t understand something as straightforward as programming, coming from law?
https://en.wikipedia.org/wiki/Barack_Obama
Please link your Wikipedia below 🫠
It’s a bit more complicated than you’re making it out to be lmfao, there’s a reason it’s only really been viable for the past few years.
The principles are really easy though. At their core, neural nets are just a bunch of big matrix multiplication operations. Training is still fundamentally gradient descent, which, while a fairly new concept in the grand scheme of things, isn't super hard to understand.
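To make that concrete, here's a toy sketch of "matrix multiplication plus gradient descent": fitting a single linear layer to made-up data. Real networks just stack a lot more of this, with nonlinearities in between.

```python
import numpy as np

# Toy version of "matrix multiplication + gradient descent":
# fit y = 2x with one linear layer. Data and learning rate are made up.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 1))
y = 2.0 * X

W = np.zeros((1, 1))   # the layer's weights: just a matrix
lr = 0.1               # learning rate

for _ in range(200):
    pred = X @ W                       # forward pass: a matrix multiplication
    grad = X.T @ (pred - y) / len(X)   # gradient of the squared-error loss w.r.t. W
    W -= lr * grad                     # gradient descent step

print(W)  # ends up close to 2.0
```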
The progress in recent years is primarily due to better hardware and optimizations at the low levels that don’t directly have anything to do with machine learning.
We’ve also gotten a lot better at combining those fundamentals in creative ways to do stuff like GANs.
Why are you acting like it’s at all difficult to understand?
AI researcher (PhD) here and for what it’s worth, Obama got it extremely right. I saw this and went “holy shit, he gets it”
Yeah, I don't believe you at all. I got my master's in AI 8 years ago and have been working in the field ever since, and no one with any knowledge would agree with you at all. In fact, I showed a couple of my colleagues the headline of this article and they both just laughed.
If you don't think AI will get there and surpass everything humans have done in the past, you should change careers.
I'm saying this because I do this for a living. It has become obvious to everyone in research (for example, https://arxiv.org/abs/2311.00059) that "AI"s don't understand what they are outputting. The secret sauce with all these large models is data scale. That is, we have not had real algorithmic breakthroughs; it's just model scale and data scale. So we can make models that mimic human language, music, etc., but to go beyond that, we need multiple fundamentally different breakthroughs. There is a ton of research attention now, so it might happen, but it's not guaranteed: the improvements we've seen in the past few years will plateau as data plateaus (we are already there according to some, i.e. we've used all the data on the Internet). Also, this: https://arxiv.org/abs/2305.17493v2
You do it for a living and you can't even understand what a general AI is. Alas, I realized long ago that most everyone is profoundly incompetent at their own job.
Obama must not have heard There I Ruined It.
While I agree, it's also the case that those …creations… are extremely human-directed. As far as I know, the maker is not only training the models for the voices, but also specifying each output word, and then its timing and pitch(es).
And of course placing the siren whistle.
But do we really need AI to generate art?
Why can’t AI be used to automate useful work nobody wants to do, instead of being a way for capital to automate skilled labor out of high-paying jobs?
It’s virtually guaranteed that at some point, robots and/or AI will be capable of doing almost every human job. And then there will be a time when they can do every job better than any human.
I wonder how people will react. Will they become lazy? Depressed? Just have sex all the time? Just have sex with robots all the time?
The last one
Why should we stifle technological progress so people can still do jobs that can be done with a machine?
If they still want to create art, nobody is stopping them. If they want to get paid, then they need to do something useful for society.
Nobody’s calling to stifle technology or progress here. We could develop AI to do anything. The question is what should that be?
There's a distinction to be drawn between 'things that are profitable to do and thus there isn't any shortage of' and 'things that aren't profitable and so there's a shortage of' here. Today, the de facto measure of 'is it useful for society?' seems to be the former, and that doesn't measure what's useful for society, it's what's useful for people that have money to burn.
Fundamentally, there isn't a shortage of art, or copywriters, or software developers, or the things they do; what there is, that AI promises to change, is the inconvenient need to deal with (and pay) artisans or laborers to do it. If the alternative is for AI vendors to be paid instead of working people, is it really the public interest we're talking about, or the interests of corporate management that would rather pocket the difference in cost between paying labor vs. AI?
🧠🤸
But do we really need AI to generate art?
No, but we want it to. It's probably only a matter of time until AI can do anything humans can, but better, including art. Now, if there's an option to view great art done by humans or amazing art done by AI, I'll go with the latter. It can already generate better photographs than I can capture with my camera, but I couldn't care less. It takes zero joy out of my photography hobby. I'm not doing it for money.
You talk like AI is a singular entity that can only do one thing?
We don’t need it. It’s just cool tech. I’ve messed around with stable diffusion a lot and it’s a cool tool.
I don’t think it’s really helpful to group a bunch of different technologies under the banner of A.I. but most people aren’t knowledgeable enough to make the distinction between software that can analyze a medical scan to tell me if I have cancer and a fancy chat bot.
While reassuring for many to hear, that’s only going to be true for so long. Eventually it’s going to be real fucking good at making “real” music. We need to be preparing for those advancements rather than acting like they’ll never come.
I feel very reassured to hear that from the AI expert / musical virtuoso himself, 62 year old, former United States President Barack Obama.
To make “real” music, AI will probably need a lot of help. Image generators and chat bots seem to have their own, very boring style. I’ve seen videos of artists using AI tools in their workflow, but it’s still a very involved process. I think it will just be another tool for musicians and sound engineers.
In the immediate term, yeah, I 100% agree. However, I don't think we should bank on that being true forever.
Jesus this is terrible. The accuracy is incredible!
(/s, I ain’t fighting fans of the good lord Bob)
Elevator music, as well as the mainstream music that the majority of people listen to, like pop etc.
That music is already very formulaic, almost as if it were generated by AI.
Already likely to be untrue, but honestly I'd happily sign up for a world where "hold music" isn't the same 20-second loop of shit jazz.
RIP Kenny G
He’s survived many mockeries over the years haha
Andre 3000 is taking the wheel for him.
Young people think all this AI stuff is great and older folks are suspicious. I think older folks are right this time.
Eh. Hard to take people’s opinion on art seriously considering what’s popular.
This is an interesting crossroads where greedy creators have to fight against greedy owners.
My experience has been the opposite. The boomers think it’s gonna do everything for them, and the young people I know think it’s gonna destroy the world.
Like any tool, it's as great as its user. I think younger generations are probably more eager to explore and expand, but it's OK to be suspicious when it's used incorrectly.
AI is great when used for some specific applications, but I had a discussion last week with someone asking ChatGPT for immigration advice… Ehh, no thanks, I'd rather talk to an actual expert.
I agree with the conclusions of the boomers, but for very different reasons. I think long-term AI will produce vastly more harm than good. Just this week we got a headline about Google, a serious, grown-up company that already makes billions, being up to some fuckery against Firefox; Facebook has been fined a million times for not respecting privacy; and Amazon workers have to pee in bottles. To my sadness, all the movement against the integration of AI in weapons built basically to "kill people" will be very noble but won't do jack shit. Do we think China/Russia are going to give a single fuck about this? Even the US will start selling AI drones when it becomes normalized. And that's just AI in war; there are another trillion things where AI will fuck things up: artists will be devalued, misinformation will reach a new all-time high, captchas are long dead making the internet a more polluted place, surveillance will be more toxic, the list goes on.
That actually might make elevator and phone hold music survivable - continual compositions that never repeat
Cheaper to generate 30s and loop it
That’s the current state of fuckery and makes self immolation a tempting option
In before obligatory republican outrage and 24x7 media coverage explaining how this comment will doom democrats in 2024
“The libs are trying to pussify our AI!”
It's more or less only (that is, mainly) useful for building components that you then use in your man-made tracks. It's a tool, just like AI image generators are tools, although there the replacement use case is substantial. AI-generated voice also needs to be considered in this context, I think.
Yeah, generative music has been a thing for a long time; Brian Eno is probably the household name recognizable for generative compositions, but most sequencers have had randomization elements built in for a long time now. I use one where you feed it a scale of notes and can define the chance a certain note will play, along with chances around the qualities of the note like duration, velocity, etc. Even my entry-level MicroFreak has a randomization option which you can use to get musical ideas from. There are some cool Eurorack modules like Mutable Instruments Grids which function like this for drum sequencing, where you have this axis to explore and can control via an LFO if you want.
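For anyone curious, the "chance a note will play" idea is simple enough to sketch in a few lines of Python. The scale, step count, and value ranges below are just examples, not how the MicroFreak or Grids actually implement it.

```python
import random

# Hypothetical sketch of the "chance a note will play" idea: step through a
# 16-step pattern and maybe emit a note from a scale, with random duration
# and velocity. All numbers are made up for illustration.
C_MINOR = [60, 62, 63, 65, 67, 68, 70]   # MIDI note numbers

def generate_pattern(steps=16, note_chance=0.6):
    pattern = []
    for _ in range(steps):
        if random.random() < note_chance:   # does this step trigger at all?
            pattern.append({
                "note": random.choice(C_MINOR),
                "duration": random.choice([0.25, 0.5, 1.0]),   # in beats
                "velocity": random.randint(60, 120),
            })
        else:
            pattern.append(None)            # rest
    return pattern

print(generate_pattern())
```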
I realize generative and AI are technically different; I think AI is much better at "can you create a synth preset to make x sound" or "write a specific genre of melody/chord progression/etc." It's a lot better at factoring in the broader context.