As a medical doctor I extensively use digital voice recorders to document my work. My secretary does the transcription. As a cost-saving measure, the process is soon to be replaced by AI-powered transcription, trained on each doctor's voice. As I understand it, the resulting model is not stored locally and I have no control over it whatsoever.

I see many dangers here, as the model is trained on biometric data and could possibly be used to recreate my voice. Of course I understand that there are probably other recordings of me on the Internet, enough to recreate my voice, but that's beside the point. Also, my question is about educating my colleagues, not a legal one.

How do I present my case? I'm not willing to let a non-local AI transcribe my voice, but I don't want to be perceived as a paranoid nutcase. Preferably I want my bosses and colleagues to understand the privacy concerns and dangers of using a “cloud solution”. Unfortunately they are totally ignorant of technology, so the explanations and examples need to translate for the layperson.

  • FlappyBubble@lemmy.ml (OP) · 10 months ago
    Sure, but what about my peers? I want to get the point across and build an understanding of the privacy implications. I'm certain this is just the first of many reforms made without a proper analysis of the privacy implications.

    • Boozilla@lemmy.world · 10 months ago

      I agree that getting the point across and having them rethink this whole thing is a much better way of handling it than a tech workaround. I'm just pessimistic that you can change their minds, and you might need a plan B.
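
      As a rough idea of what such a plan B could look like: fully local transcription is already possible with open-source models, so no audio or voice model ever has to leave your own machine. The sketch below uses the open-source Whisper model; the file name and model size are just placeholder assumptions, not anything from your employer's setup.

      ```python
      # Minimal sketch of local, offline transcription with open-source Whisper.
      # Install: pip install openai-whisper  (ffmpeg must also be installed)
      import whisper

      # Load a model onto the local machine; "base" is a small, fast variant.
      # Larger variants ("small", "medium", "large") are more accurate but slower.
      model = whisper.load_model("base")

      # Transcribe a locally stored dictation file; nothing is sent to any server.
      # "dictation.mp3" is a placeholder path for a recorded note.
      result = model.transcribe("dictation.mp3")

      print(result["text"])
      ```

      Whether something like this is workable in practice obviously depends on your IT department, but it shows the "cloud" part is a cost and convenience choice, not a technical necessity.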