Keep Two Thoughts



Noticing - Essay from Newsletter 311

I want my watch and phone to look at me now and then

No, you can’t have coffee

One of my favorite Steve Jobs memories was of him on stage at JavaOne.

JavaOne used to be a huge part of my life. It was a great tech conference thrown by Sun Microsystems, and so many of the cool new ideas of the time seemed to be somehow related to this new language.

Jobs was welcomed on stage by Scott McNealy, who talked about Steve as a hero of his and of so many of those working in Silicon Valley. Scott said something about how great it was that Sun and Apple were talking and cooperating, and asked what took so long. I remember Steve saying something like, "Well, you were busy putting Java in light bulbs and we didn't see where we fit in."

The internet says that my memory is wrong. It says that Steve said that in an interview and not on stage. I remember it that way - but my memory has been faulty before.

Anyway, we’ve had smart lightbulbs for years. Now everyone is putting code everywhere.

One of my favorite Java projects was JINI, which allowed mobile code.

In the old days, connecting to a printer was a complicated task. Now the devices find each other and you send a document to a printer anywhere on the network without thinking twice.

JINI had this notion that you would look for printers and they might be able to tell you if they could do more - print in color, send a fax, or print two-sided.
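The shape of that idea - services registering with a lookup service, advertising what they can do, and clients filtering on the capabilities they need - can be sketched in a few lines. This is an illustrative sketch only (JINI itself was Java, and these names are hypothetical, not the real JINI interfaces):

```python
# Illustrative sketch of JINI-style service discovery.
# All class and method names here are hypothetical, not the actual JINI API.

class LookupService:
    """A registry where devices advertise themselves and their capabilities."""

    def __init__(self):
        self._services = []

    def register(self, name, kind, capabilities):
        # A device announces itself and what it can do.
        self._services.append(
            {"name": name, "kind": kind, "capabilities": set(capabilities)}
        )

    def find(self, kind, required=()):
        # Return services of the right kind that offer every required capability.
        return [
            s for s in self._services
            if s["kind"] == kind and set(required) <= s["capabilities"]
        ]


registry = LookupService()
registry.register("hall-printer", "printer", ["duplex"])
registry.register("office-printer", "printer", ["color", "duplex", "fax"])

# Ask for any printer that can print in color and two-sided.
matches = registry.find("printer", required=["color", "duplex"])
print([s["name"] for s in matches])  # ['office-printer']
```

The client never needs to know the printers' addresses in advance - it just asks the registry for anything that can do the job.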

I was more interested in the broader implications of your devices being able to talk to each other.

I had a fear that I would ask for coffee and my coffee maker would check with my refrigerator because it knows that, at the time, I liked milk in my coffee. If there wasn’t enough milk, the coffee maker wouldn’t bother brewing me a cup.

I suppose now it would call out to some food delivery service for the milk and wait for it to arrive before brewing my cup.

This was more than twenty years ago and there was no AI involved. This was just a combination of simple algorithms.

Putting things together

I go to Yoga class several times a week in the same place at the same time.

If it’s 11:15 on a Thursday and I’m at LifeTime fitness, I’m there for Yoga.

I’ve also taken the time to enter the class into my calendar.

My watch knows where I am. It knows what time it is. It knows what’s on my calendar.

Why doesn’t it ask me just before 11:15 if I’m about to start my Yoga class and offer to record the exercise?

Instead, I have to tap on the fitness button and scroll down to the yoga workout and start it manually.

Every Thursday at 11:15.

Class ends at noon.

There are weeks when I forget to stop my workout and a three-hour yoga workout is recorded.

Why can’t my phone and my watch work it out?

This was the promise Apple gave us when it introduced Shortcuts, location awareness, and other technologies.

Your phone or watch would learn from your habits and suggest the right things. It would know where you are and what you did when you were there last at about this time.

You could build in little automations to make your day simpler. Ordering coffee on your way to work in the morning so you could pick it up at the drive-through. Listening to your favorite news source in the car, fetching the weather before you start out, and checking your day to see how busy you expected to be.

This is not AI - it’s just pattern recognition.

If you could see my calendar and knew that I was at LifeTime at 11:15 on a Thursday, you’d know I was in Yoga class. If you could figure that out, why can’t my watch and phone?
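The rule being described here is simple enough to write down. A minimal sketch of the kind of pattern check a watch could run locally - all names and data shapes are hypothetical, invented for illustration:

```python
# A sketch of local pattern matching: time + place + calendar -> a prompt.
# The location name, calendar format, and function are all hypothetical.

from datetime import datetime


def should_prompt_workout(now, location, calendar_events):
    """Suggest starting a workout when time, place, and calendar line up.

    `calendar_events` maps (weekday, hour, minute) slots to event names.
    """
    slot = (now.strftime("%A"), now.hour, now.minute)
    event = calendar_events.get(slot)
    if event and location == "LifeTime":
        return f"Start recording '{event}'?"
    return None


calendar = {("Thursday", 11, 15): "Yoga"}
thursday = datetime(2024, 5, 16, 11, 15)  # May 16, 2024 is a Thursday

print(should_prompt_workout(thursday, "LifeTime", calendar))
# Start recording 'Yoga'?
```

A matching check at noon could offer to end the workout. No model, no server - just a dictionary lookup and a string comparison on data the device already has.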

Patterns

I’m in Amsterdam this week and doing a fair amount of walking. I always walk a lot when I’m traveling.

When I return home my watch will let me know at some point that I'm not walking as much as I had been.

My watch knows where I am. When I pull up Maps for directions it could offer me walking directions or transit directions instead of driving directions - I never drive when I’m traveling.

I think this is part of why I push back on some aspects of AI.

We can do so much of what we need without AI.

We enter so much of our lives into our devices and they are aware of time and place and have access to our schedule and so much more.

They could be so much more helpful than they are without any bit of my personal information leaving my phone to ask an online LLM for advice that it will get wrong a good percentage of the time, at unnecessary cost.

I'm not asking for flying cars - I just want my devices to notice me.

Why can't my watch and phone notice it's 11:15 on a Thursday, see that I'm at LifeTime and, after checking my calendar, at least prompt me to record my Yoga class? And then, at noon, prompt me again to end the workout.


Essay from Dim Sum Thinking Newsletter 311.

