Midjourney/Every illustration.

I Started Talking to My Computer Instead of Typing. It Changed How I Think.

Voice-first AI removed the invisible friction between my brain and the page


Comments

Lorin Ricker 1 day ago

Many of Every's software products seem to be aimed at, or limited to, the Mac. Will Monologue be the same, or is it a web-based service and therefore available on Linux workstations?

@hdemott3 1 day ago

I sent you a note on X - not seeing this here. Anyway, enjoyed the piece. I have a ChatGPT window open constantly called Human IO interface. It's an exploration of how we process information visually far faster than any other way - but we can get information out faster through speech than any other method available (at least until Elon gets that Neuralink working). It seems like an inefficient system - like we should be able to use all that visual processing power to project information - but here we are, talking to our computers and listening to podcasts.

Oshyan Greene 1 day ago

I'm interested in your thesis and experience here, but it seems a little too entangled with what your specific AI projects, and the transformative aspects of AI, can do. In other words, it feels as much about the setups and dialogue practices you have with AI as it does about "thinking/talking out loud" and using real-time transcription. Obviously, since you're already used to what your existing projects help you do, the transformation you're experiencing is real - but how translatable is it? And, perhaps more precisely, how much of its value lies in the *combination* of real-time transcription and well-set-up projects/contexts?

I suppose one way to disentangle that a bit might be to try it with something a little less formalized or refined in terms of context. It may be a bit of an academic distinction, but to me it feels like a way of orienting toward where the real gains are. I find the idea of free-form talk and immediate transcription a bit messy and imprecise, and the implication is that AI will help make up for that - but how much setup and context-providing does that require? Or does it work well "out of the box" with a good general-purpose LLM already? I suppose I can just try it 😄

JD Deitch about 22 hours ago

@Oshyan I think you are right - this project is about execution. My read is that Katie's experience is that typing substantially slows us down, perhaps even derails us. The goal is therefore not just to create greater interface fluidity, but greater thinking fluidity as well. In the background, she still has all the contextual elements of her work (style sheets, execution sequences, etc.).