sigmoid.social is one of the many independent Mastodon servers you can use to participate in the fediverse.
A social space for people researching, working with, or just interested in AI!

Server stats: 579 active users

#AIWriting


⚠️ New essay just dropped:

🐝 Language Used To Mean Something

A raw exploration of language, reality, and mental recursion. ADHD, AI, Derrida, Wittgenstein, and a brain full of buzzing words.

Language used to refer. Now it infers. I want it to mean again.

Read here: wittgensteinsmonster.substack.

Eshu’s Substack · Language Used to Mean Something, by Eshu Elegbara

How to tell if the article you're reading was written by AI: Lisa Larson-Kelley for @FastCompany lays out five easy steps you can use to help spot AI's "emotionally forgettable" writing.

We love her idea of a support group for the em dash lovers out there!

flip.it/LllTek

Fast Company · How to tell if the article you’re reading was written by AI: You’ve noticed that articles are all starting to sound the same. Here are five reasons why.
#AI #AIWriting #Tech

A study published in Scientific Reports in 2024 claims that "AI systems emit between 130 and 1500 times less CO2e per page of text generated compared to human writers, while AI illustration systems emit between 310 and 2900 times less CO2e per image than their human counterparts." 👀

Huge if true.

Here's the kicker: "For the human writing process, we looked at humans’ total annual carbon footprints, and then took a subset of that annual footprint based on how much time they spent writing." 🤔

Of course, writing contributes to carbon footprints in the same way as all other human activities like *checks notes* heavy industry, transport, agriculture, and energy and heating. /s 🙄
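The allocation the quoted methodology describes is simple pro-rata arithmetic; a toy sketch with entirely made-up illustrative numbers (none of them are the paper's actual inputs) shows how folding a person's whole annual footprint into the "writing" figure works:

```python
# Toy sketch of per-page allocation by time spent writing.
# All numbers below are hypothetical, chosen only to illustrate the method.
ANNUAL_FOOTPRINT_KG = 15000   # hypothetical individual's annual CO2e, in kg
HOURS_PER_YEAR = 8760         # hours in a year
HOURS_PER_PAGE = 0.83         # hypothetical time to write one page

# Pro-rata share of the ENTIRE footprint (heating, transport, food, ...)
# attributed to the hours spent writing one page.
human_per_page = ANNUAL_FOOTPRINT_KG / HOURS_PER_YEAR * HOURS_PER_PAGE

ai_per_page = 0.002           # hypothetical kg CO2e for one AI-generated page

print(f"human: {human_per_page:.3f} kg CO2e/page")
print(f"ratio: {human_per_page / ai_per_page:.0f}x")
```

Under this accounting, the human number includes emissions the writer would have produced anyway, which is exactly the objection above.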

Last author Andrew W. Torrance declares holding shares in NVIDIA. 🤦

nature.com/articles/s41598-024

All credit for these insights goes to Higher Ed discussions of AI writing & use facebook.com/groups/6329308355

Nature · The carbon emissions of writing and illustrating are lower for AI than for humans - Scientific Reports: As AI systems proliferate, their greenhouse gas emissions are an increasingly important concern for human societies. In this article, we present a comparative analysis of the carbon emissions associated with AI systems (ChatGPT, BLOOM, DALL-E2, Midjourney) and human individuals performing equivalent writing and illustrating tasks. Our findings reveal that AI systems emit between 130 and 1500 times less CO2e per page of text generated compared to human writers, while AI illustration systems emit between 310 and 2900 times less CO2e per image than their human counterparts. Emissions analyses do not account for social impacts such as professional displacement, legality, and rebound effects. In addition, AI is not a substitute for all human tasks. Nevertheless, at present, the use of AI holds the potential to carry out several major activities at much lower emission levels than can humans.

Writing a nonfiction book with AI is vibecoding’s final boss. Endless paragraphs that begin with “In today’s fast-paced world,” followed by takes so tepid they could chill hot tea. But sure, let’s pretend GPT has a spine. #AIwriting 🧠📚

The cognitive debt of using AI to write essays

I am currently reading MIT’s research paper on the “cognitive debt” you can incur when using ChatGPT: Your Brain on ChatGPT: Accumulation of Cognitive Debt when Using an AI Assistant for Essay Writing Task.

You can find the link to the paper on MIT’s site.

The paragraph below caught my attention:

“This suggests that rewriting an essay using AI tools (after prior AI-free writing) engaged more extensive brain network interactions. In contrast, the LLM-to-Brain group, being exposed to LLM use prior, demonstrated less coordinated neural effort in most bands, as well as bias in LLM specific vocabulary.”

As a professional writer who loves to research random topics and write about them, and whose idea of a relaxing time is tinkering with my notes in my Obsidian vault and reading the latest nerdy theories about personal knowledge management, I’m very protective of my cognitive abilities.

The fact that using tech like AI assistants (aka ChatGPT) can have an impact on our physical brains alarms me quite a bit, especially in an age when “iPad kids” are a thing. We’re only starting to understand the impact of these devices on young brains, let alone of large language models (LLMs).

The paper seems to suggest that using ChatGPT or any other LLM to generate the essay first, then improving on it, is bad for your brain.

So, I’m glad I’m not using ChatGPT first before writing my essays.

I’ve always resisted this.

For one, I feel that the output “influences” my writing, so I refuse to ask AI to generate any copy first lest I be influenced to write like AI! (I’m the type of writer who can read someone else’s writing and unconsciously adopt their style.)

Instead, I use a technique I learned from my journalism days and from when I was actively writing novels: write a very rough, shitty first draft as fast as I can (I even have a 30-minute timer for this), then beautify the prose. Both are human activities.

I only use AI when I hit a “wall” or writer’s block. I usually ask it for suggestions to improve sentence construction, most often titles, which I admittedly need a lot of help with for SEO reasons. I rarely, if ever, use the suggested copy wholesale; I rewrite it.

That said, besides the environmental impacts, we now need to consider the physical impacts on us before we use AI to write essays. Maybe even before brainstorming, because apparently, if we use AI for advice, explanations, or ideas, it can foster dependency.

Alas, I have to admit I love using AI for this use case. It’s like having conversations with another nerd about silly subjects and I can go down rabbit holes that way.

I have to admit, I like using AI to clarify my thoughts about decisions I’ve made, and that is a tad too soothing for me!

To clarify, however, AI isn’t the first place I turn when I want advice, but I need to remember to reach out to humans first and not replace human advice with AI. I can see how, once I get too comfortable, I might forget to do just that and become dependent.

Recently, I shared a post on Mastodon about a brainstorming technique I stumbled across on YouTube, and was surprised by the pushback I received.

https://www.youtube.com/watch?v=p63MKDEsuFc&feature=youtu.be

I, too, use AI to “stay in the subject” when exploring ideas. I personally think this is a healthy and productive way to use AI that won’t, well, damage your brain.

That said, I hope all of us remember that not only can using AI incur cognitive debt; it also takes a real toll on the environment. We should be conscious that using AI to generate an image of a dog flying in space for larks impacts the environment:

Each time a model is used, perhaps by an individual asking ChatGPT to summarize an email, the computing hardware that performs those operations consumes energy. Researchers have estimated that a ChatGPT query consumes about five times more electricity than a simple web search. – Explained: Generative AI’s environmental impacts (MIT)

I hope, moving forward, that we writers can use AI in a way that is good for our brains and our environment.

That means not using AI for everything. That's a little challenging these days, I have to admit, when an AI button is everywhere. They are like little red buttons, enticing us to push them.

👋 Hello @artificialintelligencenews.in!

We’re Author AI, the first AI built specifically for long-form writing. Think novels, screenplays, and technical manuals, but with structure at scale.

We’re not here to replace the human author. We’re here to support those with powerful stories, but limited time, resources, or access to craft training.

We'd love to share some of our work with you for your review!