Users will flood back in the next few weeks when school comes back. I’d like to see another breakdown in December.
Higher ed, primary ed, and homework were all subcategories ChatGPT classified sessions into, and together, these make up ~10% of all use cases. That’s not enough to account for the ~29% decline in traffic from April/May to July, and thus, I think we can put a nail in the coffin of Theory B.
It’s addressed in the article. First, usage started to decline in April, before school was out. Second, only 23 percent of prompts were related to education, which includes both homework-type prompts and personal/professional knowledge seeking. Only about 10 percent were strictly homework. So schoolwork isn’t a huge slice of ChatGPT’s use.
Combine that with schools cracking down on kids using ChatGPT (on in-class assignments, tests, etc.), and I don’t think you’re going to see a major bounce back in traffic when school starts. Maybe a little.
I’m starting to think generative AI might be a bit of a fad. Personally I was very excited about it and used ChatGPT, Bing, and Bard all the time. But over time I realized they just weren’t very good: inaccurate answers, bland writing, just not much help to me, a non-programmer. I still use them, but now it’s maybe once a day or less, not all day like I used to. Generative AI seems more like a tool that is helpful in some limited cases, not the major transformation it felt like early in the year. Who knows, maybe they’ll get better and more useful.
Also, not super related, but I saw a statistic the other day that only about a third of the US has even tried ChatGPT. It feels like a huge thing to us tech-nerdy people, but your average person hasn’t bothered to even try it out.
People had a huge surge of interest in it at first because they wanted to know what it was about; it was fascinating and exciting, especially exploring its limits and playing around - it was fun spending hours just messing with it. The numbers are bound to drop as the novelty wears off; the number of people actually using it to get stuff done might still be going up even if the totals fall by half.
Certainly some teachers fear new technology, but most realize that they have to teach kids to live in the world the kids will be growing up into, not the world the teacher grew up in. Teachers will be educating kids on how to use it as a research tool, how to use it in projects, and how to use it to improve writing quality - eventually we will see more kids getting marked down with ‘ChatGPT could have helped reword this’ than with ‘this is written too well, you must have used modern tools to help’
Not that I have any faith in teachers being sensible. When I was at school they wouldn’t accept typed homework on the premise that ‘when you get a job your boss isn’t going to allow you to type up your work’ - and I’m only talking about the mid ’90s here lol
Really though I think most people are going to be interacting with LLMs via tools built into other things - it’ll be one of those things we only really notice when it’s annoying. Daily use will be things like refining searches when online shopping: ‘I need a plug for my bath’ returning a selection of bath plugs rather than electrical connectors, music by the band Plug, and pluggano pasta. Especially when it can show a selection and then refine it when you say ‘like that but in pink’, or answer ‘what’s the difference between these two’ with ‘this is three dollars and made from a softer material for a more effective seal’
Oof. I’ve tried it with a few PowerShell things and it has recommended cmdlets that don’t exist, parameters that don’t exist, or the wrong usage of cmdlets.
It’s really limited to basic, junior level programming assistance, and even then it’s not 100% reliable. Any time I’ve tried asking it something more advanced it takes a lot of coaxing to get it to output reasonable code. But it’s helpful for boilerplating basic code sometimes.
Have you tried 3.5 or 4?
I haven’t had many issues in 4. Occasionally it does what you’re saying and I just say “bro, that doesn’t exist” and it’s like “oh, my bad, here you go.” And gives me something that works.
I used gpt4 for terraform and it was kind of all over the place in terms of fully deprecated methods. It felt like a nice jumping off point but honestly probably would’ve been less work to just write it up from the docs in the first place.
I can definitely see how it could help someone fumble through it and come up with something working without knowing what to look for though.
Was also having weird issues with it truncating outputs and needing to split it, but even telling it to split would cause it to kind of stall.
I don’t remember what version. I just gave up trying
Well don’t expect it to just give magical results without learning prompt engineering and understanding the tools you’re working with.
Set-MailboxAddressBook doesn’t exist.
Set-ADAttribute doesn’t exist.
Asking for a simple command and expecting to receive something that actually exists is magical?
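Side note: when an LLM hands you a shell command, one cheap sanity check before running it is to confirm the executable even resolves on your PATH. A minimal Python sketch (it won’t catch hallucinated flags, and PowerShell cmdlets would need `Get-Command` instead, since they aren’t PATH executables):

```python
import shutil


def command_exists(name: str) -> bool:
    """Return True if `name` resolves to an executable on PATH."""
    return shutil.which(name) is not None


# A hallucinated cmdlet name won't resolve anywhere.
print(command_exists("Set-MailboxAddressBook"))  # False
```

Obviously this only catches the most blatant inventions, but that’s exactly the failure mode being described here.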
I love using it while programming but I almost never use it besides that. Not even sure what I would use it for besides that on a day to day basis.
Sometimes I use it for laughs. I usually use it as a search engine on steroids when I can’t find the answer to a problem. Not having data past 2021 is a huge limiting factor for real productivity though.
I know it’s not popular, but Bing Chat works surprisingly well if you need a GPT response that can hit the Internet. It’s not perfect but anytime I need current information I generally use it and it’s worked pretty well for me!
I also use it for programming and today is the first day that I experienced the degradation that everyone has been talking about. It was spitting out the same code over and over, saying it was changing it, and then it slowed to a crawl and barely responded. Most of its answers were wrong and unhelpful. I have really enjoyed using it instead of stackoverflow for a few months now, so I hope this isn’t something that’s going to continue.
As for other uses, my wife and I used it to find a movie to watch a few days ago. We described the type of movie we wanted to see and asked it to recommend 10. We picked one and it was exactly what we wanted to watch. That was really neat.
I used it to write a resume and cover letter for me, which I then punched up. I figure that since companies are using AI to review resumes, I should use one to write one.
Recently, I used it for book/author recommendations. At first I also used it for coding, but now I just ask it to explain concepts to me (what’s the difference between… / what are some ways to approach…)
Basically how non-tech people thought search engines worked at the beginning of this century.
When I’m feeling blue I ask it to say nice things about me.
I write a lot of emails for work, but I’m not the most eloquent writer, so I get wordy.
I sometimes feed my email into ChatGPT and ask it to rewrite it to be more concise while keeping a friendly but professional tone. Boom, done.
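The whole trick is basically one system prompt. A minimal sketch of the request payload you’d send to the chat completions API (the prompt wording and model name here are just my assumptions, not anything official):

```python
def concise_email_request(email_body: str, model: str = "gpt-4") -> dict:
    """Build a chat-completions payload asking for a concise rewrite."""
    return {
        "model": model,
        "messages": [
            {
                "role": "system",
                "content": (
                    "Rewrite the user's email to be more concise, "
                    "keeping a friendly but professional tone."
                ),
            },
            {"role": "user", "content": email_body},
        ],
    }
```

Paste the dict into your API client of choice and you’ve automated the “boom, done” step.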
Well, I even subscribed to it at some point. But they really dumbed down the GPT-4 model, so it’s basically on par with the GPT-3 model. And since open-source models have become good, there’s no point in using ChatGPT anymore.
What open source models? Can you recommend one?
HuggingChat. Basically the state-of-the-art conversational LLMs are hosted there for free.
Especially Facebook’s Llama2 and any checkpoints of that model are solid.
And if you are a coder, I think there was one called “Starchart” just aimed at code autocomplete, which seems to be a good start.
But ofc as alehc already mentioned, these all exist on Hugging Face and you will find a treasure trove of AI models on there regarding every possible implementation.
Here too
Well, if they hadn’t nerfed it, maybe it wouldn’t have gone down so much.
It’s abysmal at this point… Whatever they did to it, the results are now awful and far more inaccurate than they were a few months back.
Idk man, I’ve been having a blast with the API and GPT-4. Once I got it working, I basically get access to GPT-4 for pennies. Plus, if you’re real wacky and pay for a per-token subscription from ElevenLabs, you’ll have a voice assistant too.
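For a rough sense of “pennies”: at GPT-4’s 2023 list prices for the 8K-context model ($0.03 per 1K prompt tokens, $0.06 per 1K completion tokens — check current pricing before relying on these numbers), a typical exchange costs a few cents:

```python
def gpt4_cost(prompt_tokens: int, completion_tokens: int) -> float:
    """Estimate one API call's cost in USD at the assumed 2023 list prices."""
    return prompt_tokens / 1000 * 0.03 + completion_tokens / 1000 * 0.06


# A typical short exchange: ~500 tokens in, ~300 out.
print(f"${gpt4_cost(500, 300):.3f}")  # $0.033
```

So even a few dozen exchanges a day stays around a dollar, versus the flat monthly subscription.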
Who are these people who have Datos installed and opted in to having their net usage surveyed?
I use it when I need it to speed up python scripting for CG applications, but I don’t need it on a constant basis. It could be weeks or more between when I’ll dig into it.
I used it today and the answer it gave me was right, but the explanation for how it got there was so ridiculously bad.
A ChatGPT browser extension is needed for news, to summarize BS articles lengthened with meaningless filler words.
Yeah, this is exactly the sort of use I think we’re going to see become most common, especially tools that allow you to ask about news, events, social media, etc.: ‘What have my friends been talking about recently? Did Janet ever post a resolution to the thing with her mom?’
‘What happened with those ships that crashed in the canal, did they get cleared?’ or ‘I’m thinking of going to Paris in July, what’s been happening there and what’s the weather like?’ - and it’s able to sum it up and tell you about things you might want to know more about based on your interests.
Also being able to actually sort and filter - ‘just miss me with the football news’, ‘just ignore any posts about my friends’ kids or bullshit about anniversaries, birthdays and stuff’ - might make social media enjoyable to use.
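A toy sketch of what that filter’s shape looks like — in practice the topic check would be an LLM classifier, not the naive keyword match used here, and the muted-topic list is made up:

```python
# Stand-in for "miss me with the football news": mute posts that
# mention any unwanted topic. A real version would ask an LLM to
# classify each post instead of matching literal words.
MUTED_TOPICS = {"football", "birthday", "anniversary"}


def keep_post(text: str) -> bool:
    """Keep a post only if it mentions none of the muted topics."""
    return not (set(text.lower().split()) & MUTED_TOPICS)


feed = ["Big football transfer news!", "Photos from my trip to Paris"]
print([post for post in feed if keep_post(post)])  # only the Paris post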
Is there any chance that this is fallout from the Reddit API changes? Lots of people were training LLMs using Reddit. If you can’t do that anymore, then that would cause a decrease in use. Right?
That was way too recent. And it wouldn’t affect the users of GPT directly, only the training, which wasn’t using super-recent data to begin with anyway.
Tracks with my experience - I played around with it when it first went live and then got bored and moved on, haven’t been back to it in months.