Wednesday, September 11, 2024

Managing ideas that turn up unannounced

I don't know about other artists and writers, but I find that I'm most likely to get ideas for my creative projects when I'm least able to act on them. You say to yourself, "I'll write it down later", but more often than not, you'll either forget about it completely or have that frustrating feeling of knowing you had a good idea but now it's gone.

One of the most common times for me to get ideas is at around 2 or 3 in the morning, when I'm in that foggy area between sleeping and waking. I do try to keep some sticky notes and a pen next to my bed for when I have these midnight ideas, or failing that, I can write something in the notes app on my phone. This doesn't always help, though; my sleepy brain is not good at communicating with my awake brain, and more often than not, the incoherent ravings I leave for myself are beyond my ability to decipher. Below is an example of an actual "idea" I found one morning when I woke up (and no, I still don't know what it means several months later). That being said, I have ended up with some truly bizarre notes that I was actually able to figure out the meaning of, so it's still worth a try.

[Image: A phone with the notes app open, showing text that says "Jimmy Barnes cat but a jelly fish".]

So, there's probably not a lot I can do about the midnight ideas, but I also often get ideas during the day when I'm doing academic work, like marking assignments or revising my thesis (which I should be doing right now SHUT UP DON'T JUDGE ME), or even sometimes when I'm working on a different creative project. For those random ideas, I carry a pocket-sized notepad when I'm away from my computer, and for when I am at my desk, I have a Word document set up for getting the ideas down as fast as possible. It's important to note that these documents are not designed for ideas that are eloquently written or painstakingly crafted into intelligent prose. No, they are for a quick and ugly dump of whatever ideas are rattling around in my brain so I can get them out of my head and move on with whatever I'm supposed to be doing. One of my friends in my writing course many years ago coined the term "brain poo" to describe this process of essentially crapping out ideas at speed, and I liked it so much I still use it today.

The benefit of this process is that you don't have to worry about whether the idea is good or not, or try to figure out how to express it. That's Future You's problem. Once you get the bones of the idea down on paper (or in a blank Word doc), that niggling fear of "what if I forget the idea?" is gone. Later on, when you have time, you can open that document and see if the idea is workable (whether it be for a new project or for something you're currently working on). If it is, you can sit down and start expanding on it. If it's not, you can just bin it.

Unfortunately it doesn't help with that itch to just write the story, but it does at least ensure you have some fuel to work with next time you do have the chance to sit down in front of your WIP.

Thursday, September 5, 2024

NaNoWriM-Oh no

At the start of September, NaNoWriMo made clear their position on the use of AI in writing, and though it seems they've edited the wording slightly since the furore around their statement began, they still seem oblivious to why people are so opposed to their views.

Before I launch into my own anti-NaNoWriMo rant, allow me to provide some background information:

I am an academic, and a teacher at a university. I teach a variety of units, but there is one unit in particular that I have taught for more than a decade, both as a tutor and as a lecturer (I have also developed and run an online version of the unit for my university's online-only course branch). So I think it's fair to say I have the experience to recognise trends and patterns in the student cohorts in terms of their achievements.

Based on what I have seen, I firmly believe that the introduction of AI tools such as ChatGPT will set back humanity's development and advancement by a decade or two. When we set a task in class for students, instead of engaging with one another and discussing the concepts, they open up ChatGPT and paste the tutorial instructions into it, and when we ask them to share the answers, they just regurgitate the slush ChatGPT spat out. The problem is that while ChatGPT is very good at producing content that sounds reasonable, it is not good at nuanced thinking, self-reflection, or considering scenarios that might fall outside the norm. No matter how many times we pull students up on their incomplete (or, in many cases, incorrect or inappropriate) answers that they got from ChatGPT, they'd still rather rely on a flawed system than try to apply the content or principles. In other words, they seem to have lost the ability to think for themselves. This isn't just a problem in education but has wider implications for society as a whole.

This is particularly evident when it comes time for me (and my colleagues) to mark assignments. Because many of the students did not actually complete the activities in class, they didn't learn the skills required to complete the assignment tasks properly, which means that instead of meaningful deliverables and insightful analysis of what they have done, we end up with pages and pages of word vomit that use a lot of big and fancy words (if I took a shot every time I saw the word "meticulous" in an assignment that was very clearly not done meticulously, I would have liver damage) but don't actually say anything of any substance or value. It's been a few semesters since ChatGPT became widely accessible, and when I say that the average student marks have dropped by a full grade since that point, I am not exaggerating.

So when I saw that NaNoWriMo had come out in support of people using AI in their 'writing', I wasn't particularly surprised because of how problematic they've been in recent years (and I'd already decided I wasn't going to participate again because of the appalling way they handled those incidents), but I was disgusted.

I've linked to their statement at the top of this post, but I want to focus on three sections in particular that stood out to me.

"NaNoWriMo does not explicitly support any specific approach to writing, nor does it explicitly condemn any approach, including the use of AI... We fulfill our mission by supporting the humans doing the writing."

Aside from how pathetic and wishy-washy this comment is, it's contradictory and also demonstrates a lack of understanding about what AI actually is, and how it works. Content generated by ChatGPT and similar AI tools doesn't just magically come from nowhere. It is built on stolen work. Actual artists and writers created this original content, and ChatGPT just chews it up and spits it out without providing any acknowledgement or compensation to the human beings without whose work it couldn't exist. You can't claim to "support the humans doing the writing" when you allow or encourage the use of a tool that does the exact opposite of supporting actual human creators.

On a side note: Writers who complain on social media about your writing being fed into AI but then include AI-generated 'art' in your social media posts? You are part of the problem. You cannot claim to be upset about your work being stolen when you are turning around and doing the exact same thing to other creatives.

"We believe that to categorically condemn AI would be to ignore classist and ableist issues surrounding the use of the technology, and that questions around the use of AI tie to questions around privilege."

This bit actually made me snort. Implying that disabled or poor people can't write without AI to help them is far more condescending and ableist/classist than criticising the use of Artificial Idiocy ever could be. In fact, the poor or disabled people NaNoWriMo claims to want to support in their ridiculous statement are also among the most likely to be disadvantaged by the existence of these AI 'tools', because they're far less likely to have the resources (time, energy, money) to fight back when their work is stolen and passed off as the magnum opus of some pretentious wanker who thinks they're going to be the next Hemingway just because they mashed a few buttons in ChatGPT. Frankly, if you can't write stories without using a machine to steal bits of other people's stories for you, that's not a case of "ableism" or "classism". That just means you're not fit to be a writer.

"It's healthy for writers to be curious about what's new and forthcoming, and what might impact their career space or their pursuit of the craft."

At this point, writers who genuinely care about the craft have a pretty solid understanding of how AI might impact their career space. Spoiler: It's not good. I follow many artists and writers on various social media sites, and I have not seen a single positive comment about AI from any of them. It's not just that AI steals the content from the original creators without paying them. As with my students, many people would apparently rather have something crap but fast and easy than put in time and effort or pay for something that is actually worthwhile. AI 'art' is the fast food equivalent of creativity: Sure, you can have it quickly, but it has no value and you'd regret consuming it if you actually thought about it for more than a minute or two. The increase in people turning to AI to pretend to make things for them means the people who actually make the art or write the stories you love aren't getting paid, and if they're not getting paid, the industry is no longer sustainable for them, so they will just stop creating; that means less new content for fans.

And it's not just the financial impact on the creators. True creativity is what differentiates us from machines. The need to make something that evokes feelings and provokes reflection is something that only humans have. The desire to grow and improve and become good at something is what lays out the pathway for a kid scribbling away in their notebook to practice and learn and eventually make something that only they could have made, because it has come from their experiences and their thoughts and their emotions and is, in some way, a window into their soul. If we take the soul out of art, what's the point?

Anyway, it's disappointing that an organisation that used to be a fun and engaging way for writers to communicate with one another has turned into *gestures vaguely at the festering corpse of NaNoWriMo's integrity* whatever this is, but I think it is also now pretty clear that NaNoWriMo is no longer worth your time or money.

I just deleted my NaNoWriMo account (which I should have done years ago but just never got around to it), and I suggest you do as well.

EDIT: Some arguments I frequently see from people trying to justify the use of ChatGPT etc. are:

  • AI is going to take people's jobs so they should just accept it and work out how to co-exist with AI instead of being in denial and fear.
  • People already repurpose other people's work and call it inspiration.

My response to these arguments is:

The thing about people taking other people's work as inspiration is that even that reimagining of an idea is still based on that person's own experiences and history, and things that resonate with them. Whereas AI just takes everything. It's like making a soup out of every single ingredient in your cupboard instead of just choosing the few ingredients that actually work well together.

And as far as taking jobs goes, why are we automating creative jobs, when there are multitudes of people willing and able to create good art, stories, and so on, instead of automating the boring and tedious parts of jobs (or life in general) that no one wants to do? Automating those would actually give people more time to enjoy life and do the things that matter. The people coming up with this AI tech think they're entitled to other people's creative output and that it's okay to just take it because they don't see it as having any value, which I guess is why they consider the generic crap it spits out "good enough".

While AI might theoretically improve to the point where it's actually good at what it does, the fact that it's not there yet but people are using it anyway is why I feel it's not a good thing for advancement. AI doesn't seem picky about what it uses, so the more crap AI puts out, the more crap AI will consume when it's trying to generate new content. Too many people seem happy to just accept the junk ChatGPT spits out, so they're going to become reliant on it instead of thinking critically, figuring things out for themselves, and finding creative solutions, which is basically how humanity got to where we are now.

On a side note, even if AI somehow did magically become useful, is the cost actually worth it?

Losing the plot (or at least untangling it a bit)

Over the last month or so, I haven't done as much writing on my actual manuscript as I would have liked thanks to assignment marking dea...