Alexa, why have you charged me £2 to say the Hail Mary?

  • I don't think this particular incident is any kind of big deal, though it's funny. It was only £1.99 a month and there was a 7-day free trial anyway. Amazon say they emailed about it as confirmation. I think the issue it shows is how you can't trust voice control for anything important. It's a particularly bad interface for buying stuff. If I am buying some dog food, I search for dog food, glance over the options, check prices and delivery dates and reviews, then plump for one. The thought of doing the same by voice drains me.

    "Alexa, order me the cheapest best reviewed dry dog food that will arrive tomorrow". Could that work? Would I trust that it had worked, if Alexa just said "ok" without me running off to a computer to check the order details and defeating the whole object? More likely it would turn into an exhausting game of twenty questions with the device narrowing my selection iteratively.

    (I actually tried that sentence just now. Alexa remained in stunned silence.)

  • Short version: Alexa supports third parties extending it with "Skills" apps, which may be commercial. Paid functionality can be enabled by verbal confirmation of an in-app purchase[1], and may be one-time or recurring.

    Her sister caught the corresponding email just before the paid subscription would have started. The documentation says developers must mark skills targeted at kids, and those aren't eligible to have this flow enabled.

    [1] https://developer.amazon.com/es-MX/docs/alexa/paid-skills/ov...

    edit to add -- that was all presented without comment. But the existence of a device that can start a recurring subscription when anyone says "yes" to a single prompt, with the toggle for that feature on by default, puts it firmly in the "no thanks" column for me.
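    As I read the docs in [1], the flow is that the skill hands control to Amazon's own purchase dialog rather than handling payment itself: the skill's response carries a Connections.SendRequest directive, and Alexa then asks the user to confirm. A rough Python sketch of just that directive (the product ID and helper name are my placeholders, not from the docs, and real skills would use the ASK SDK):

```python
# Sketch of the directive a skill returns to start Amazon's voice
# purchase flow. Amazon runs the "say yes to confirm" dialog itself;
# the developer never touches payment details.

def build_buy_directive(product_id: str) -> dict:
    """Build a Connections.SendRequest 'Buy' directive for an
    in-skill product (structure per the paid-skills docs in [1])."""
    return {
        "type": "Connections.SendRequest",
        "name": "Buy",
        "payload": {
            "InSkillProduct": {
                # Placeholder ID; real IDs come from the developer console.
                "productId": product_id,
            }
        },
        "token": "correlationToken",
    }

directive = build_buy_directive("amzn1.adg.product.example-subscription")
print(directive["name"])
```

    The point being: one "yes" in that Amazon-owned dialog is the entire confirmation step.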

  • I’m more shocked that the developer of a “skill” that says the Hail Mary makes $2 a month and has 10,000 users. That’s a sweet return for work that’s on the scale of copy-pasting the Alexa getting-started tutorial.

  • Yeah, Amazon is terrible about this. It claims that it must have had the user confirm the purchase, but I wouldn’t be surprised if it didn’t. I order from Amazon about once a year, and the last time I did, they ran the “Would you also like to sign up for Prime while making this order?” scam on me. I hit the button and immediately realized what I had done. I hit the back button before the next page loaded, but it was too late. I already had an email saying I had signed up, without confirming anything or agreeing to any terms of service. Luckily I only had to go through about 3-5 pages of “Are you really sure you want to cancel?” and “How about we just suspend the service but don’t cancel it?” before I was allowed to cancel. This company is disgusting.

  • Be smarter than a smart speaker by never purchasing one. Why put an always-on snitch in your home? Such convenience is a trap.

  • > set up by my sister... and is attached to her... account

    > can inadvertently enter into premium subscriptions simply by saying yes

    So this individual (sister to a journalist at the Guardian, which facilitates the spread of information and highlights how many cases must go under the radar) gives somebody else a voice-controlled machine linked to a credit card. What could possibly go wrong?

  • Seems like the email should be the confirmation, and without it the trial should stop. Opting in with a single "yes", when my four-year-olds can now ask Alexa for stuff, seems like a terrible design choice by itself.

  • Never mind prayers. The new frontier for AI is confessions. And Alexa should sell indulgences.

    Seriously though, how long before we see AI-powered therapists? Or do they exist already?

  • Anything I don’t expect to use to make purchases I try to enable whatever “child lock” features it might have.

  • Does God answer prayers from robots?

  • god runs on a freemium service

  • Oh this dystopian future.

  • > Thank goodness she didn’t ask Alexa to say the Rosary

    The highlight of the article there.