Post by Caulder Melhaire on Feb 23, 2023 15:03:35 GMT -6
Yeah, no, I agree completely. It's a pretty bunch of words, but there are far too many contradictions and threadbare analogies packed in there to sway my vote. For real, trains?! Not an option?! Look at that note about oil companies running the show and you'll find the real reason this country gave up on its rail systems. For that matter, look at any of the myriad nations with massive public transit and railway systems that now have an easier, cheaper way of getting from A to B than we could ever dream of. I'm one of the many who took my ass to the east coast and back on $180 by riding the train, instead of spending hundreds on a round-trip flight or a rental car, and I cannot BELIEVE that we haven't developed it more.
I get being excited about new technology, but this entire argument for why we should roll over and accept the chatbot overlords is incredibly limited in its worldview. Scribes still exist in so many cultures. Oral traditions still exist. And - as someone who formerly held this specific view and then jumped into the world of digital art - anyone who says that digital artists, who spend years mastering layering, light balance, saturation, and a handful of digital brushes, are less talented than traditional artists has never tried scratching out a portrait with a tablet. There is so much comparable skill between those two variations on the craft, and it simply isn't present in AI ghostwriting.
For me, I'm caught on something that's been stated a few times now with minimal contradiction: whatever these people are doing with AI isn't exactly writing, but more like directing or tweaking a bunch of autofilled output. So what are these people, then? Because they aren't writers. Programmers? Directors? Producers? What that tells me is that these are two very different markets we're dealing with, and the only reason for the AI one to roll into our territory is to try to steamroll us and make a quick buck. And folks will jump in head first to defend them because "progress is inevitable"? That's such a cop-out, an excuse not to put ethics and empathy over money, and an attitude that, frankly, has led to a lot of shitty legislation and living situations for a lot of people.
This right here is the big one for me. Progress that lifts up everyone involved is fantastic. But yet another big tech project that shoves people off the radar to line the pockets of folks who already have the advantage over everyone else is a tired old tale. This is quickly going to stop being about the so-called natural progression of things and turn into another way to shift the profit margins away from the people doing the bulk of the foundational work.
So yeah, I'm not convinced enough to sway my support for banning this kind of writing from our site. We're the former market there, and I don't appreciate them butting in on how too many people here are trying to make a living and make their voices heard, just because it's easy. It's also easy to go create your own market for AI stories and dominate there until you can get ahead of the game by more than just riding the current trend. This is a community of writers who came here to learn the old-fashioned way and hone our skills to match. This is the Legend Fire Critique Community for creative writers, not the Legend Fire programming community.
Anyway.
Something I am very curious to see is how copyright law is going to handle AI art. Cause like... that's gonna be a disaster, yeah? If one of these developers decides they want to try copyrighting their stuff, what's the legal case for or against allowing them to? And how the heck do you even track that? LOL
Deleted
Deleted Member
Posts: 0
Post by Deleted on Feb 23, 2023 18:55:17 GMT -6
With every new technology, certain jobs people spent decades learning become obsolete. And no, I'm not flippantly saying "Well, learn something new." I'm an accountant, and accounting is one field that AI is going to eliminate, and soon, and it's scary. It's scary and sucky for me to realize this.
But pull back. The printing press made scribes obsolete, but we would never say, "Oh, we need to just get rid of all printing presses and bring that career field back." No, the printing press pushed us into an era of learning unlike anything before it. It changed the world!
There is always a period of growing pains when a new technology comes along and shakes things up, but more often than not, once the growing pains are done, which admittedly can take a generation or so, society as a whole is better off for it. The main problem these days is that we've had so many of these technological advancements packed into such a short span. Before the advent of the railroad, there were only a handful of life-changing inventions. In the last century or so, we've had electric lights, automobiles, radios, televisions, telephones, smartphones, computers, the internet; the list goes on and on. The computer I'm writing on has more computing power than the one they used to send people to the moon in 1969!
I'm just saying, I understand where the vitriol is coming from, but in the end, it'll be okay. The dust will settle, writers will continue to write, and other people will use ChatGPT or other such products to create their own content. There are almost 8 billion people in the world, and seemingly all of them want to write books. Millions of books are being written every year. We're already oversaturated with stories of every kind. And yeah, 99.9999% of those books and stories suck because they're written by amateurs who only THINK they can write and don't even bother editing their work before publishing it on Amazon, but let's face it, wading through the crappy indie books to find the good ones is hard as hell already. If all those people used ChatGPT, at least their work wouldn't have so many spelling errors!
In the end, we write because we love it. I write because I love it. I don't write to make any money, because I don't make money from it. Which, by the way, is great when you have a wife who likes to say things like "Why do you even bother? You're not going to do anything with the book anyway." Just gives me all kinds of enthusiasm to keep going.
But yeah, make a rule that people not use it in contests here. That's fair. I'd thought about getting ChatGPT to write a short story for me and posting it in the fiction section, with a full disclaimer that it was created by AI, to get people to read it and critique what the AI was capable of, but I thought maybe it wouldn't be received very well right now. Maybe some other time, lol.
Post by RAVENEYE on Mar 3, 2023 12:44:24 GMT -6
There's no reason we can't add an AI Content subforum. Sounds interesting, in fact. Better, the more familiar we are with AI-made content, the better equipped we'll be to recognize it when it's used in our contests and other publications. So gimme a bit to process this and I'll see about adding a place for you to embark on this experiment. Hell, I might even join you.
Bird
Counselor
Posts: 350
Custom Title: World Creator and Destroyer
Preferred Pronouns: they/them/their
HARD: 1700
MEDIUM: 400
EASY: 110
Post by Bird on Mar 3, 2023 14:07:30 GMT -6
Considering that the points I made about ChatGPT are mostly being ignored, I'm fully against this project and against giving it space on the forum. This is a creative writing and critique forum, not a place to post regurgitated chatbot output. Again, ChatGPT is absolutely 100% INCAPABLE of recognizing whether it puts out fact or fiction, and it often puts out falsehoods because it can't discern that. (I already shared my proof of its failings several posts ago.)
This is a grift by tech companies to spam us with low-quality articles to pad their search rankings and make a quick buck. It's also destroying writers' chances to sell their work, as many outlets have closed submissions while they sort out a way to deal with the overwhelming spam of shitty ChatGPT stories. The harm is very real, and it's why I find us hosting a space for it somewhat unethical. These tools aren't helping people tell a story; again, it's regurgitated prose taken from other people's work. Even the court system is starting to go against this craze and has ruled that AI art and AI writing cannot be copyrighted, due to the nature of how the chatbots and generation engines work.
If folks are using it to help generate ideas, then sure - that's the only thing the chatbots do that is somewhat useful.
If people are using it to ask questions? Stop doing that, because the answers (as many are now proving in their research and studies) are false and riddled with errors, and it's not possible to program these chatbots with an awareness of what they output.
If people are using them to write an entire story, then please, for the love of everything, stop trying to pass that off as your own and submitting it places. It's spam at that point. Keep it to yourself and stop flooding the markets and shutting down submission windows for the rest of us. To tell a story, people need to understand the basics of storytelling, but the chatbot engines don't have the capability to recognize those elements; in my own experiments with them, they keep falling back on really frustrating tropes and limited story structures. Given how the training of these things works, it's not possible to program that awareness of their output into them. (And to be clear, because of the enormous training sets and how long training takes - months to years in some cases - the developers themselves often can't fully explain what the model has learned by the end of training.)
There have also been issues with the ethics of how the training sets are formed: most are scraped from the web and often contain copyrighted work, so there's plagiarism going on here as well. These projects were built without consideration of ethics, and until they grapple with that, these chatbots are more a hindrance than a help. Sometimes when new tech appears, it isn't a good thing, and if we do NOT consider the pros and cons thoroughly, we can end up hurting ourselves more than helping. But I guess if y'all wanna play around with fire, go right ahead. Just don't say I didn't warn you.
Post by RAVENEYE on Mar 3, 2023 14:33:10 GMT -6
Thoroughly noted. Your arguments are not being ignored, not in the slightest. Most of us are in huge agreement with your points and your proofs. I do believe it's possible to create a place for us to observe AI behavior AND present it as something we are wary of, even in disagreement with. It is also possible for us to use the AI's crap results against it in our response here at the forum. Once its crap results have been posted (be it a summarized plot passing itself off as a story, or flawed nonfiction), we can pick them apart, make a display of their faults, and therefore make an argument for why subbing the material to markets is a problem for human writers. The goal is NOT to promote it as the be-all and end-all for us as writers. The very opposite, in fact. I'm eager to have fun tearing it apart and disproving its results. And I want our mods especially to be able to recognize AI content, which I feel is extremely important for the future of the forum and our writing vocation.
Deleted
Deleted Member
Posts: 0
Post by Deleted on Mar 4, 2023 6:14:21 GMT -6
I tried to write a whole story with it, and it does work, it just needs lots of love. Lots and lots. It is easy to fall into the trap of "Write the next scene where blah blah blah happens" and just go with it, rather than thinking out the scene and all the little nuances of the story. It always tried to wrap up the story at the end of each scene, and I'd have to go back and rewrite the prompt to specify what happened at the end of the scene. With some practice, I think it would be possible to get an entire story out that isn't TOO bad. And it did randomly bring up a few plot points that I actually liked. If you think of yourself as a director, fine-tuning your prompts to detail everything you want in each scene, you can get some pretty good stuff out of it.
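For anyone who wants to see the shape of that scene-by-scene "director" loop, here's a rough sketch using the OpenAI Python client. The model name, the instructions, and the scene briefs are all made-up placeholders for illustration, not anything from my actual project:
```python
# Rough sketch of the "director" workflow: prompt one scene at a time,
# feed the story so far back in, and explicitly tell the model not to
# wrap the whole story up at the end of every scene.
# Assumes the OpenAI Python client; model name and prompts are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_PROMPT = (
    "You are drafting scenes for a novel. Write only the scene you are "
    "asked for, in full prose. Do not conclude or wrap up the overall story."
)

scene_briefs = [
    "Scene 1: The heroine finds a sealed door beneath the library. "
    "End with her deciding to keep it secret.",
    "Scene 2: She confronts the archivist, who denies the door exists. "
    "End with the archivist quietly slipping her a key.",
]

story_so_far = []
for brief in scene_briefs:
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # placeholder model name
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            # Carry earlier scenes back in so the model keeps continuity.
            {"role": "user", "content": "Story so far:\n\n" + "\n\n".join(story_so_far)},
            {"role": "user", "content": brief},
        ],
    )
    story_so_far.append(response.choices[0].message.content)

print("\n\n".join(story_so_far))
```
The per-scene "End with..." line is doing the same job as going back and rewriting the prompt to specify what happens at the end of the scene, and carrying the earlier scenes forward is what keeps it from losing the thread.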
That said, to be honest, the main thing I will be using it for, besides helping me hash out plots and outlines, is writing blurbs, pitches, query letters, and all those things that I have always dreaded. And it does those very well. I had an idea for a futuristic sci-fi book that I was using ChatGPT to help me hash out (not one that I'd let it write, just help me outline and whatnot), and I asked it to write a 200-word pitch for the story it had been helping me create. It did, and it was great. Well, at least it made ME interested in reading the book, which is what a pitch is supposed to do, right?
Now, it got a couple of the details wrong, like the year it was set, which AI is in charge of the facility, and the name of the book. But the thing is, I never named the book. It just gave me a name. The story is loosely based on Norse mythology, only made science fiction and set on a planet called Valhalla, and so ChatGPT decided to name the book "Valhalla's End." Which... is actually pretty good.
The thing is, ChatGPT and its ilk aren't MEANT to write fiction. They're meant to be assistants. Yeah, research isn't necessarily the smartest thing to use it for, but I have a lawyer friend who is using it to help him write contracts, wills, and the like. Sure, he has to read it over to make sure it's still accurate, but that's a heck of a lot faster than writing the whole thing by hand. My dad has a career-coaching and resume-writing business, and he writes cover letters for each of his clients as well. He's started using it to write those cover letters for him, because they always take him a good deal of time, and ChatGPT can do it in about ten seconds. Again, he has to read it over to make sure it's accurate, but that's fine.
It's got a lot of potential to make a lot of lives easier. I'm just wondering where it will be ten years from now.
Post by Valhalla Erikson on Mar 4, 2023 6:48:56 GMT -6
ChatGPT is useful when you find yourself in writer's block and need something to point your story in the right direction, but the one thing you shouldn't do is leave it to its own devices. Work with the AI.
Bird
Counselor
Posts: 350
Custom Title: World Creator and Destroyer
Preferred Pronouns: they/them/their
HARD: 1700
MEDIUM: 400
EASY: 110
Post by Bird on Mar 4, 2023 11:38:59 GMT -6
RAVENEYE said: "Once its crap results have been posted (be it a summarized plot passing itself off as a story, or flawed nonfiction), we can pick them apart, make a display of their faults, and therefore make an argument for why subbing the material to markets is a problem for human writers. The goal is NOT to promote it as the be-all and end-all for us as writers. The very opposite, in fact."
What? I get what you're saying in the rest of your response, but I'm highlighting the above because I shared proof of this already. We don't need to reinvent the wheel or redo the research here; the harm is already being done. (For nonfiction, take a look at the article I linked in my initial reply to the thread about CNET using AI to write flawed and inaccurate nonfiction articles, and the research proving it. For fiction, I linked Clarkesworld documenting the harm.) Clarkesworld and other open-submission markets have shut down submissions for an unknown amount of time and listed this spamming as the reason. Why do we need to prove the harm this does to submitting material to markets when magazines like them already have?
To make it clear why I'm hammering this so damn much (I don't know if I was clear about it before): these are all magazines I had carefully researched and been preparing submissions of my short stories for -- it's taken me forever to do this since I'm ill -- and now I can't submit to them because they slammed their doors shut, citing the flood of ChatGPT spam. (Before someone jumps on me: YES, I verified they are all closed, and all over this same issue. They were all always-open markets, which is exactly why I picked them, since my illness is variable.) So yeah, I'm pissed about this situation. I am being harmed already. Now I'm back at square one and too ill at the moment to do anything but be pissed off at how cavalier some folks have been about this. (Not saying you're being that at all, as I know you aren't. But some folks are.) I'm not the only one -- the writer networks on Mastodon are blowing up with furious authors who now can't sub their work. That's why I've got no patience for it. (Edit: I feel like I'm repeating myself, so I give up.)
If you wanna use it to train yourselves on how to spot this stuff? OK, that's valid. Go for it.
Deleted
Deleted Member
Posts: 0
Post by Deleted on Mar 4, 2023 15:46:34 GMT -6
I read an article that argued (and I may end up agreeing once I've thought about it longer) that the only reason this is disrupting anything is the way our society is set up. We need money to live, and we trade our labor for money. In the art world, that translates to us trying to sell our art for money so we can live to make more art. Because of this, when automation takes over an industry, it is horribly disruptive: people are suddenly out of a job, and therefore out of their livelihood.
If we didn't have a society that was reliant on money to survive, one that traded labor for money, then it wouldn't be a big deal. No one would submit AI-generated stories to magazines, because why would they if they weren't getting paid for it? People would make their own art and share it with whomever they wanted. Giant marketing companies wouldn't choose Brandon Sanderson over Joe Schmo, because they wouldn't be making money off that choice. You wouldn't have people churning out garbage content they aren't passionate about because they "have to pay the bills." How many of us have taken jobs that we hate simply because we have bills to pay and it was the only offer on the table? (Raises hand.)
So maybe the issue isn't so much the automation or the AIs as the way society is set up. I'm not a socialist, and I don't mean for this to devolve into a talk on economics, but I've always been intrigued by the concept of Universal Basic Income, wherein every individual is paid a certain amount, typically just enough to barely scrape by (usually said to be roughly $10,000 a year), as a matter of course, with no stipulations or qualifications. Every person, rich and poor, gets this amount. For the poor and middle class it's a supplement to their income; for the rich it's a small tax break. But everyone gets it, and it replaces all other forms of welfare. There's no way to cheat the system, because there's no qualification. You just get the money every month.
You get a society that can make that work, and all of a sudden "working a dead-end job to make ends meet" isn't as important. Maybe you don't want to just scrape by, so you go get a job. If you work at McDonald's for $12 an hour, that's roughly $24,000 a year; combine that with your UBI and all of a sudden your McDonald's job is bringing in $34,000 a year. If you're married, that $10k becomes $20k, because there are two of you, so with one of you working at McDonald's you're bringing home $44k a year, which is actually livable, depending on where you live, of course. And the people who don't want to work can just do their art.
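If you want to sanity-check those numbers, here's the arithmetic spelled out as a quick script. The figures are the same hypotheticals used above (a $10k-per-adult UBI, $12 an hour, roughly 2,000 working hours a year), not real policy numbers:
```python
# Rough sketch of the UBI arithmetic above, using the post's hypothetical
# figures -- not real policy numbers.
UBI_PER_ADULT = 10_000      # dollars per year, per adult
WAGE = 12                   # dollars per hour
HOURS_PER_YEAR = 2_000      # ~40 hours/week for ~50 weeks

job_income = WAGE * HOURS_PER_YEAR                    # $24,000
single_total = job_income + UBI_PER_ADULT             # $34,000
married_one_earner = job_income + 2 * UBI_PER_ADULT   # $44,000

print(f"Job alone:                  ${job_income:,}")
print(f"Single worker + UBI:        ${single_total:,}")
print(f"Married couple, one earner: ${married_one_earner:,}")
```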
I don't know if that's a solution, or if it would even work on a large scale, but it's worth thinking about. Maybe the issue isn't that AIs are evil. Maybe it's just that they're not compatible with the society we have built, and maybe that society should be updated to fit the times.
Post by Valhalla Erikson on Mar 4, 2023 15:56:58 GMT -6
The only thing about ChatGPT I'm not a fan of is how it refuses to tell a story that might seem offensive. I know enough about writing to know that no matter what you do, there will be someone out there who is offended by what you put in your novel. If you have a story to tell, then tell it. An author shouldn't be handicapped into limiting their potential.
Post by Alatariel on Mar 4, 2023 16:13:48 GMT -6
I think it would mainly be an exercise for us to better recognize the writing AIs produce, so we, the mods, can spot it if it's used on the forum. I haven't really seen many stories produced by ChatGPT because I'm not actively looking for them or studying them. It would be handy to see how others here pick them apart and what obvious patterns we can identify. I agree that it could be fun to do, a challenge for me personally, and I'm sure others would find it an interesting challenge as well. But I hear you loud and clear: the mass use of this tool is starting to cause harm to writers. I do hope these magazines and publishers find a way to work around this issue. It seems like whenever we come up with new tools like this, the industries they impact find ways to adjust... it just takes time, and in the meantime we are the ones who suffer most.
Post by RAVENEYE on Mar 4, 2023 17:07:38 GMT -6
Indeed. And yes, Bird, I read that article as soon as it came out, and I voiced my fury over AI shutting down our zines way back in this thread, days before you posted the link (pretty sure it's the same link). Being upset over this is extremely valid, AND learning how the AI functions and how to recognize it on our own forum is a matter of survival for us as writers right now. Whether the technology eventually gets shut down is a question for the months and years to come. Right now, as we can all see, it is an issue we have to learn to deal with, and we need to understand it, not just give it the cold shoulder because it's causing horrible repercussions for us. But I totally get your choice to have nothing to do with it.
I respect that completely. What I'm seeing here are two valid ways of fighting this thing: one, boycotting it completely; two, fighting fire with fire by getting ahead of it and setting a little backburn. Neither is bad. Both are active, just in different directions. Honestly, if I had the funds and/or know-how, I would implement software that screens all our forum content for AI-generated material, just as magazines like Clarkesworld are having to do. We don't have that luxury, so we've got to figure out some other way.
Post by Valhalla Erikson on Mar 4, 2023 17:22:21 GMT -6
Probably going to be in the minority here. Despite my own negativity and concerns, there are some uses for the tool. For storytellers, it's brainstorming a summary for your novel or helping you get out of a writer's block funk. Just don't become too dependent on it, and don't have it write your whole work for you, because it wouldn't be fair to you, as an author, for an AI to do your Story Quest for you.
Post by Mazulla on Mar 5, 2023 0:23:36 GMT -6
I pretty much agree with you. I'm somewhere in the middle. AI absolutely shouldn't be abused and left to write entire stories, especially if people then turn right around and submit those "stories" to publishers or elsewhere. People are definitely in the right to be angry and to fight against that.
But if a person wants to try it as a curiosity? To test its limits? To brainstorm or help them out of writer's block? Go for it, imo. But it should be treated as a tool and treated with respect.
I did try ChatGPT for one night a couple of weeks ago. I gave it a scenario and asked it questions I already knew the answers to (more or less), but I wanted to see where it went, i.e. questions about a magic system and a society I'm developing. Most answers and explanations were pretty generic, not too in-depth, and a bit repetitive, with only a few responses being interesting or takes I hadn't considered. For me, it falls in the "curiosity" category, and I can't see using it often, but maybe occasionally for brainstorming when I'm really stuck. It seemed to work better with what/how/why questions than with asking it to write an entire scenario for you.
AI is definitely a problem, especially while it's still new and trending right now. I also think it's here to stay and it'll be a balancing act that will need to be carefully navigated in the future.
Post by Valhalla Erikson on Mar 5, 2023 10:19:56 GMT -6
Yeah, I find it insulting as a writer when a novel is turned in to a publisher and the writer claims they wrote it, even though the AI did the majority of the work. AI art feels different to me, IMO. While you haven't actually drawn the piece, you still contribute to its creation by feeding in the right prompts to produce the kind of art you want, and, speaking from personal experience, getting AI art to come out right can consume a lot of frustrating hours. With ChatGPT or other tools like it, I feel an author limits their creative process if they just let the tool do the work for them rather than working WITH it.