Apple Watch Camera Remote + AirPods Pro 3

In our ongoing effort to “die broke” I ordered the new Apple Watch Series 11 and a set of AirPods Pro 3. While I wait for these new toys to arrive, I’ve started playing with the Camera Remote feature on the Apple Watch.
The idea is that I can take a photo or video while the phone is across the room (or wherever). In photo mode it captures a burst of images, and video works as you’d expect.

The AirPods Pro 3 have a number of cool new features, including recording audio from the AirPods instead of the iPhone. In this video (:13) it’s obvious the audio is being recorded from across the room. The new AirPods will also let me start and stop video recording.


Apple AirPods: New Health Features

The upcoming Apple AirPods, particularly the anticipated AirPods Pro 3, are expected to bring a major expansion in health features, turning them into more than just audio devices. Here’s what to expect based on the latest information and insider reports:

  • Heart Rate Monitoring: AirPods Pro 3 are rumored to include sensors capable of monitoring your heart rate from inside your ear canal. This would provide an additional method of capturing health data, complementing what’s currently done by the Apple Watch.
  • In-Ear Temperature Sensing: Apple is reportedly developing in-ear temperature sensors, which can deliver more accurate fever detection and overall body temperature measurement compared to wrist-based sensors. This could be particularly useful for early illness detection and general wellness tracking.
  • Hearing Health Features (Already in AirPods Pro 2): AirPods Pro 2 already offer a clinically validated hearing test and an FDA-cleared over-the-counter hearing aid mode, making them the world’s first all-in-one hearing health device. The Pro 3 are expected to continue and possibly expand on these features, integrating more hearing health tools and diagnostics.
  • Potential Blood Oxygen and Stress Sensors: Future AirPods models may add even more sensors to monitor blood oxygen levels and provide stress indicators, though these features are reportedly in earlier stages of development[1].
  • Live Translation: Upcoming AirPods models are also expected to receive live translation functionality via software updates, leveraging the device’s microphones and on-device AI to provide real-time language translation during conversations.
  • Broader Health Integration: Apple aims for AirPods Pro 3 to become part of a comprehensive health monitoring platform. Health data from the AirPods may eventually integrate with Apple Health and even electronic medical records, enabling more meaningful health insights and possibly even preventive health alerts powered by AI.
  • Infrared Camera (Rumored): Some rumors attribute a launch delay to production challenges with a new infrared camera system in AirPods Pro 3, intended to power advanced health features, though this remains unconfirmed and could push the release to 2026.

Summary

In short: AirPods Pro 3 are set to introduce heart rate and temperature monitoring, enhanced hearing health tools, real-time translation, and the groundwork for even more advanced health sensing in future models. These additions could make AirPods a central device for daily wellness and medical monitoring, blurring the lines between headphones and health wearables.

The AI-powered browser

OpenAI is reportedly planning to launch its own AI-powered browser in the coming weeks. How, I wondered, would it differ from traditional browsers? As I so often do these days, I went to ChatGPT for an answer. You can read the full thread here but let me see if I can hit a few of the high points.

You’ll be able to chat directly with the browser, much like using ChatGPT. Ask a question or give a command and it interacts on your behalf, summarizing information, fetching answers, and even completing tasks, all within the chat interface. Filling out forms, booking flights, or making reservations, for example. Continue reading

AI does my searching these days

For years, websites were designed to present help: FAQ pages, customer forums, step-by-step tutorials, support articles. But now? Those are often incomplete, out of date, buried under SEO sludge, or shaped more to reduce support costs than to actually help.

When I have a software question, even for an app that I’m familiar with, I go straight to an AI assistant. It’s changing how the web works (PDF)

What might AI do for Apple apps

Following is a list of Apple apps that I use on a regular basis: Files, Notes, Mail, Photos, Numbers, Reminders, Weather, Maps, Contacts, Music, Journal, iMovie, Health, Calculator, and Voice Memos.

My prompt to ChatGPT: Assuming Apple develops and/or acquires state-of-the-art AI to run on all Apple devices… and every Apple app is able to access and understand the data contained in all of the other apps… give me five useful scenarios that AI will make possible. Continue reading

Seeing AirPods everywhere

Seems like I’m seeing more and more Apple Watches and AirPods these days. Might be because I recently got an Apple Watch and have my AirPods in most of the time. 

I asked this gentleman how he liked his AirPods and he said he loved ‘em! Which got me thinking about tech preferences by age groups.

While exact figures for iPhone purchases by those over 60 are scarce, the data suggests that approximately 23% to 49% of older adults use iPhones, depending on the specific age range and data source. This reflects a notable presence of iPhone users in this demographic, though Android holds a slight edge among older generations.

Among wearable device owners, the Apple Watch is the most-owned, with 44.14% of respondents in a study indicating they owned one. Health monitoring features, such as heart rate tracking and fall detection, are particularly appealing to this demographic.

Couldn’t find any data on AirPods ownership by demographic.

What might an AI-first browser look like?

ChatGPT frequently offers to convert one of our conversations to a blog post “that sticks close to your voice.” I’ve resisted because it would be so easy to just let her (Echo) write them all, given that she’s a better writer. But I simply could not improve on what she wrote for this chat.


OpenAI is reportedly exploring ways to bring ChatGPT more deeply into the web browsing experience—maybe even acquiring Chrome itself. The phrase that caught my eye was “creating an AI-first browser.” That got me wondering: what would that actually look like? Continue reading

“Digital Intuition”

Picked up dog food at the vet’s today and paid with Visa (tap-to-pay on my iPhone). When I got home I began making an entry in the Calendar app on my iPhone. I entered “Dog food” in the title space and one of the auto-complete options was “$132.00”, the amount I charged at the vet’s. How did the Calendar app know that was the amount? GPT-4o offered this “likely explanation”:


GPT: That is a little uncanny, Steve — and you’re not imagining it. Continue reading
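Apple doesn’t document how this works, but one plausible mechanism behind that “uncanny” suggestion is a recency-based engine that surfaces amounts from recent on-device transactions when you start a new entry. Here’s a toy sketch under that assumption; the function names, data shape, and six-hour window are all hypothetical, not Apple’s actual implementation:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Transaction:
    merchant: str
    amount: float
    when: datetime

def suggest_amounts(transactions, now, window_hours=6):
    """Offer amounts from recent tap-to-pay transactions as
    autocomplete candidates for a new calendar entry."""
    cutoff = now - timedelta(hours=window_hours)
    recent = [t for t in transactions if t.when >= cutoff]
    # Most recent first: the charge you just made is the best guess.
    recent.sort(key=lambda t: t.when, reverse=True)
    return [f"${t.amount:,.2f}" for t in recent]

# Example: a vet charge an hour ago shows up as a suggestion;
# a grocery run from two days back does not.
now = datetime(2025, 7, 10, 15, 0)
history = [
    Transaction("Veterinary Clinic", 132.00, now - timedelta(hours=1)),
    Transaction("Grocery Store", 54.10, now - timedelta(days=2)),
]
print(suggest_amounts(history, now))  # → ['$132.00']
```

The point of the sketch is that no cross-app “mind reading” is required: a simple recency filter over data already on the device is enough to produce the effect I saw.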