
[–]supermouse35 82 points83 points  (5 children)

This is my biggest fear right now. A significant chunk of my work involves writing, and I'm terrified that AI is going to put me out of business.

[–]Electronic_Lock325 55 points56 points  (3 children)

I cried too and felt so discouraged. I wrote a poem for a contest, plugged it into AI, and the AI's version was much better. I don't want to send that one, though, because it doesn't have my own emotion in it.

[–]describt 17 points18 points  (2 children)

That's a horrible opinion to have about your own writing.

I value your writing because it was written by a human being, with real emotions--something no computer could ever fake without plagiarism--and that's beautiful.

It's like a painter condemning his work by comparing it to a photograph. It's art precisely because it is projected through a flawed medium, and not a thousand masterworks regurgitated through a computer.

Look closely at AI writing and it looks like a thesaurus threw up on the page: countless bad word choices, in awkward syntax, that don't make any sense to the human ear.

We know that just because two words are synonyms they aren't necessarily equal, and that only one of those words fits the mood and meaning of the sentence.

"To exist, or not to exist. That is the interrogative."

Be you and write your genuine experience, because you are beautiful.

[–]stunna_cal 43 points44 points  (13 children)

Then us plebs don’t stand a chance. I never did. Was never a good writer or speaker.

[–]isthebuffetopenyet 42 points43 points  (4 children)

Hang on, you've got this backwards: the things you are proficient at, you will now be able to enhance through the use of AI.

Great business idea but unable to communicate it effectively? AI to the rescue. Fabulous career history but unable to compose a resume? AI to the rescue.

I think that people who possess skills which AI can't replicate are about to have a strong future.

[–]stunna_cal 17 points18 points  (1 child)

Oh don’t get me wrong. I’m gonna use the shit out of AI lol. More doom and gloom for future generations, wealth gap and all.

[–]isthebuffetopenyet 1 point2 points  (0 children)

That's true. Only way to help them is to make as much money as possible to pass the wealth down. Tragic circumstances.

[–]dirtyxglizzy -1 points0 points  (1 child)

Yeah, but how are you gonna sell that to a company that also has access to the same AI?

[–]Subject-Orchid-463 12 points13 points  (7 children)

Become a tradesman!!!

[–]Wise-Masterpiece-590 29 points30 points  (2 children)

It's not for everyone, brother, and even if it were, at times writers pay us to build their buildings. As fields like theirs and others diminish, those effects will be felt in the trades as well. Not saying trades aren't good (I love mine), but preaching "become a tradesman" only lasts so long.

[–]pinkfootthegoose 15 points16 points  (1 child)

but preaching "become a tradesman" only lasts so long.

and many will wreck your body by 40 if you don't move to management. I did my stint of blue collar work, no thanks.

[–]Ha1rBall 24 points25 points  (0 children)

I grew up in, and live in, a very trade heavy area. 8 out of 10 people in the trades are either pill heads, alcoholics, or their bodies are destroyed. The trades aren't for the faint of heart. There is a reason that most of them pay well.

[–]ecclectic 12 points13 points  (0 children)

As others have said, this is generally not good advice to hand out. I've dealt with way too many people over the past 3 years who thought they could make a go of "getting into a trade." Unless someone is stepping into a union position as the relative of a shop steward, a union rep, or someone in an otherwise untouchable position, it's rough out there and most people can't hack it.

Everyone is desperate for skilled trades, but no one has the time or money to train humans. The entry level positions are being filled by robots, and the mid to high level positions are filled by gen Xers and boomers who the companies can't afford to let retire.

That largely leaves starting out with mom and pop shops who can't afford to pay high wages when every job they take on costs 50-75% more because they're training an apprentice.

[–]alphawolf29 1 point2 points  (0 children)

People ask me all the time if I'm afraid automation is going to take my job. My job is to fix automation when it fails; my industry is already as automated as it can really get.

[–]Five_Decades -1 points0 points  (0 children)

It's hard on your body, and a lot don't pay well. Plus as the market floods, wages will drop.

[–]WifeofBath1984 22 points23 points  (0 children)

This is what worries me. I fancy myself a writer and I absolutely am a bibliophile. Terrifying implications for the future of literature.

[–]suprbert 28 points29 points  (33 children)

I think about all the people coming out of college with computer science degrees. As I understand AI, which is to say, about as much as the average history major, the demise of those types of jobs is inevitable now.

[–]FM-96 14 points15 points  (5 children)

Perhaps I'm biased, being a software engineer myself, but I really don't think so. I think our jobs are actually among the ones benefitting the most from AI.

AI can semi-reliably aid us, but it can't reliably replace us. Computer programs aren't like essays or artworks; they don't just need to seem right and look good, they actually need to be semantically correct. AI (being "just" a sophisticated word predictor) can't guarantee that; you always need a human double-checking and validating the generated code.
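A tiny generic sketch of what that human double-checking looks like (the function and test cases here are hypothetical, not from any real codebase): generated code can look plausible and still be semantically wrong, and only tests catch it.

```python
# Suppose an AI generated this function for us (a made-up example).
# It looks right at a glance but is subtly wrong for even-length inputs,
# where the median should be the average of the two middle values.
def median(values):
    ordered = sorted(values)
    return ordered[len(ordered) // 2]

def check(fn):
    # The human's part: pin the semantics down with tests instead of eyeballing.
    cases = [([3, 1, 2], 2), ([4, 1, 3, 2], 2.5)]
    return [fn(inp) == expected for inp, expected in cases]

print(check(median))  # [True, False]: the generated code fails the even-length case
```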

[–]Schuben 4 points5 points  (4 children)

Yeah, I got a response from ChatGPT-4 that included a completely fictitious parameter that just happened to neatly solve the problem I was having. Sadly it didn't actually exist, and the real solution was completely different. AI can be very confidently incorrect, and you just have to be aware of this and check its work. It has helped me find new ways to approach solutions and given me a very good framework to build off of, but rarely is it actually correct for what I'm working with.
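As a hypothetical illustration (using Python's standard json module, not the commenter's actual tool, and an invented flag name): this is what a hallucinated parameter looks like in practice. The call reads naturally, but the interpreter rejects it because the parameter simply doesn't exist.

```python
import json

data = {"b": 2, "a": 1}

try:
    # An LLM might confidently suggest a convenient-sounding flag like this,
    # but json.dumps() has no "auto_fix" parameter ("auto_fix" is invented here):
    text = json.dumps(data, auto_fix=True)
except TypeError:
    # The real solution uses a parameter that actually exists:
    text = json.dumps(data, sort_keys=True)

print(text)  # {"a": 1, "b": 2}
```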

[–]TheGuyfromRiften 4 points5 points  (3 children)

It's the "black box" problem of generative AI. Since they don't show their work, you have absolutely no way of corroborating an AI's process or checking whether the underlying knowledge it is extrapolating from is false.

What's more, even the developers will have no idea how an AI got to an answer because the AI is teaching itself without humans involved.

[–]suprbert -1 points0 points  (1 child)

AI is teaching itself without humans involved?

This sounds like a recipe for disaster. Couldn't a small error in a system like that get compounded to the point of rendering whole sections of AI knowledge into nonsense?

I'm out of my depth on this topic, but I appreciate this conversation.

[–]TheGuyfromRiften 1 point2 points  (0 children)

Essentially, humans give the AI the data it needs to learn from. The AI then uses algorithms and logic that developers have also given it (essentially teaching the AI how to learn).

Then the developers tell the AI what kind of outputs to generate and allow it to generate data.

This sounds like a recipe for disaster. Couldn't a small error in a system like that get compounded to the point of rendering whole sections of AI knowledge into nonsense?

I mean, it's already been seen in algorithms and machine learning software that sifts through resumes for hiring. Because the biases that humans have exist in the hiring data, the AI learns that bias and spits out biased output. With humans, you can usually tell if there's bias involved (internal communications, attitudes toward different races, etc.); with AI, you cannot, which means an AI could be racist and we would never know.
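A toy sketch of that mechanism, with entirely made-up data and a deliberately trivial "model": anything that learns from historical decisions will reproduce whatever bias those decisions contain, and from the outside it just looks like math.

```python
# Hypothetical past hiring decisions: (group, was_hired). The group labels and
# rates are invented purely to show bias flowing from data into a model.
history = [
    ("group_a", True), ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

def train(records):
    # "Training": estimate each group's historical hire rate from past decisions.
    rates = {}
    for group, _ in records:
        decisions = [hired for g, hired in records if g == group]
        rates[group] = sum(decisions) / len(decisions)
    return rates

def predict(model, group, threshold=0.5):
    # The candidate's group alone drives the prediction; nothing about skill.
    return model[group] >= threshold

model = train(history)
print(model["group_a"], model["group_b"])  # 0.75 0.25: the bias, learned verbatim

# A group_a candidate "passes" while an identical group_b candidate does not,
# and nothing in the model's output announces why.
```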

[–][deleted]  (5 children)

[deleted]

    [–]Schuben 2 points3 points  (3 children)

    AI can help with well-known and publicly documented programming, such as in a base language or using a code base that is freely available for an AI to train on. You could potentially train a large language model on a private code base, but that lacks a lot of the nuance and breadth of information that public documentation has built up, so the LLM can't accurately predict what should come next when composing the response.

    I've found it useful for guiding me to functions I wasn't aware of, which I then had to translate into the custom code I work with in order to apply them. You also have to check its work, because it still often uses completely made-up methods or adds extra parameters that seem like they belong, and would make things very easy for your use case, but are just flat out not there in the real code. It likely learned these things from code people wrote on top of the base code, so it thinks they apply just as well, since it's hitting on the same language or system you're using, but they don't.

    [–]IfItQuackedLikeADuck 2 points3 points  (0 children)

    First it starts with tools like Personified that supposedly boost productivity, but then reliance on them makes management question roles that can be handled with 80% AI output and 20% human effort just for review.

    [–]suprbert 1 point2 points  (1 child)

    What you describe sounds like AI is improving the coding language. Made up methods that don't really exist but that would improve the process if they did.

    Is it possible that this is what will begin to happen? The made up stuff that AI is outputting that actually seems useful will get folded into updates to the coding language?

    This is not my milieu at all, by the way. Just throwing that in there in case I'm ignorant of something considered obvious.

    [–]Schuben 0 points1 point  (0 children)

    Well, I meant more that it was making up standard methods for standard classes that didn't exist, and adding new parameters to methods that aren't there either. It's possible its 'inspiration' for this was a custom code extension, so the format and name are the same, but unless you have the customization it doesn't mean anything to the 'out of the box' user.

    I was pleasantly surprised, however, by how competent it was at writing code that contained its own methods, referenced its own class name, and correctly used its own method names higher up in the code before the method was written below!

    [–]frozen_tuna 3 points4 points  (0 children)

    Exactly. Devs that don't update their skills will fall out of favor, but that's literally been the case since like the 80s. Devs who do update their skill set will be in high demand for decades to come.

    [–]smoozer 21 points22 points  (9 children)

    Did factory employees all disappear when automation started being invented? Nope, the type of job and number of employees just changed.

    [–]Haunting-Ad788 50 points51 points  (1 child)

    The number of employees changing is a massive looming problem.

    [–]smoozer 0 points1 point  (0 children)

    In some industries. Other industries are emerging and growing rapidly.

    [–]dolche93 24 points25 points  (2 children)

    All of them didn't disappear. Plenty of them did, though.

    I run a machine that does the work of at least a dozen people. What are they doing now? Some other job that didn't exist 50 years ago, probably.

    An issue I see is that we aren't creating enough new jobs with AI to replace the ones it's going to make redundant.

    [–]ChickinBiskit 0 points1 point  (1 child)

    It's almost like we need to move past having a job being a requirement for living and participating in society 🤔

    [–]dolche93 0 points1 point  (0 children)

    Right? We're so productive that, as a society, we could choose to make things like food and housing post-scarcity.

    Capitalism requires scarcity, though, so good luck.

    [–]atomic1fire 4 points5 points  (1 child)

    I assume what happened was that jobs requiring a lot of precise, repetitive manual labor got replaced with machines, but the machines probably have an operator who runs them and, in some cases, performs simple maintenance/repairs.

    So you don't need employees who are really good at the thing the machine does (unless the machine completely breaks), but you do need employees who can push a button or operate a foot pedal for long periods of time and for a higher quantity of product, while also keeping an eye out for defects.

    Plus there may be local or regional requirements that require human employees build or oversee a product being built to qualify it as "region made".

    [–][deleted] 0 points1 point  (0 children)

    QC vision systems can automate the discovery and rejections of defects now.

    [–]Schuben -1 points0 points  (0 children)

    I'm glad my employment prospects are no longer 50% farmer or 50% anything else.

    [–]Congregator 1 point2 points  (3 children)

    I thought this as well, but then again, I’m not a software engineer.

    I have a buddy who's a high-level software engineer working for Nvidia; he just wrapped up his PhD in Machine Learning and creates machine learning algorithms as his job.

    We were hanging out last week, and I had a fear that, since I’ve only recently started learning to code, I would never have a side hustle because of AI.

    His response to me was that “AI seems very esoteric to someone who isn’t a developer, and AI is only as good as those who are programming it”, and that it completely relies on developers and engineers to maintain itself.

    What I got from him, in the end, is that it’s easy to forget that people have to build, design, and maintain new servers, create new algorithms for problems not yet realized, and make minute tweaks for specific needs that won’t yet be programmed.

    The jobs will evolve, but AI in many ways will stay one step beneath human ingenuity (in his theory), because there are so many people in the world that it’s next to impossible to account for every human element and creative response to a given outlier. Anomalies not only occur, they can change the course of society rapidly. (Consider a sort of “miracle” occurring and being replicated before the algorithm for said “miracle” is programmed: the whole span of variables needs new algorithms, and this is a dense sort of problem.)

    You have to retrain all the models, and who retrains the models as of now? Developers.

    There always needs to be a developer at some point.

    Human beings are anomalies in themselves; I mean, this is how we get religion, miracles, and coincidences that change whole social, cultural, and evolutionary movements.

    Consider this, even though it’s not real as of today: AI is dominating the marketplace based on our known data, etc.

    Someone with three heads is born, and they can cure cancer with the touch of a hand and breathe fire on command. This probably isn’t going to happen, but if it did, AI wouldn’t be able to change all of its algorithms on its own to account for that, or for how it changes history, evolution, or scientific thought.

    What I’m getting at is that AI, the way we as non-developers think about it, is a little more “science fiction” geared than the actual reality.

    [–]suprbert 0 points1 point  (0 children)

    I should probably back up a step and check my notes on what a computer scientist actually does. At the core of it, it's manipulating information, right? But the practicum of that is coding and developing algorithms and such. Assuming that's right so far, isn't that something that AI can already do much more quickly than a human?

    I had a version of this conversation IRL with my girlfriend earlier; she said that CS people will have MORE jobs the more prevalent AI becomes (a general synopsis of what you're also saying). But isn't AI and deep learning a specialized field within CS? Like, just because I can drive a car doesn't mean I can pilot a riverboat, though they are both vehicles. Would a CS grad, studying whatever general CS is and means, be able to pivot to specializing in the care and maintenance of AI that easily?

    Sorry if this is turning into an "explain like I'm five".

    [–]hillsfar 0 points1 point  (1 child)

    Yes, humans may always be needed.

    But the number of humans needed will decrease exponentially.

    Even as the human population continues growing.

    [–]Congregator 0 points1 point  (0 children)

    You said yourself “even as the number of humans *needed”.

    Needed by who?

    Needed by humans

    [–]unosami -1 points0 points  (5 children)

    Wouldn’t the advent of AI mean we need more programmers to program the AI?

    [–]notMrNiceGuy 7 points8 points  (4 children)

    If the AI is actually good enough to replace competent programmers then it’s likely good enough to program itself. I don’t see AI actually replacing programmers all that soon though.

    [–]unosami -1 points0 points  (3 children)

    That’s my point though. AI is nowhere near able to replace competent programmers at the moment.

    [–]ZorbaTHut 3 points4 points  (1 child)

    Similarly, three years ago it was nowhere near able to replace competent writers.

    [–]unosami 0 points1 point  (0 children)

    And it’s still not. There’s just a bunch of big companies jumping on this AI bandwagon.

    [–]CritterMorthul 2 points3 points  (0 children)

    Damn, I need to get a degree before they learn how to detect AI-generated texts.

    Easy street intensifies

    [–]Manoj_Malhotra 1 point2 points  (0 children)

    Tell her to take the premed reqs and apply to med school.

    [–]brabarusmark 0 points1 point  (0 children)

    I can see why your daughter would feel that way. AI writing has a way of tricking our brains: we see well-structured sentences and a very "professional" use of language that suits writing for an assignment, for example.

    I would actually be in the camp that says English majors (and other language majors) have become more important than before. Today's AI is drawing on the writings of the entire world, many of them by trained individuals who know how to craft sentences. Tomorrow's AI needs to be trained on the content of today, which will come from writers like your daughter.

    [–]Trylena -1 points0 points  (0 children)

    Yeah, but without a guide the AI wouldn't have been able to get to the same place your daughter got to. On top of that, we tend to be harsher on ourselves because we always want to become better. AI is good, but it still needs a lot of guidance.