• Endmaker@ani.social · 4 days ago

    If it has to be live (i.e. work in real time), doing the translation on a server elsewhere is probably a bad idea due to the latency.

    It makes more sense to run an optimised translation model locally - ideally on the AirPods themselves but, if not, on the connected iPhone.
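
    A rough back-of-the-envelope comparison of the two paths (all numbers are illustrative assumptions, not measurements):

    ```swift
    // Rough latency budget per audio chunk for live translation.
    // All figures are illustrative assumptions, not benchmarks.

    let captureMs = 50.0            // buffering a short chunk of speech

    // On-device path: optimised local model on the AirPods/iPhone.
    let localInferenceMs = 120.0
    let localTotal = captureMs + localInferenceMs

    // Server path: same capture, plus a network round trip and
    // server-side inference on bigger hardware.
    let roundTripMs = 150.0         // cellular/Wi-Fi RTT, highly variable
    let serverInferenceMs = 60.0
    let serverTotal = captureMs + roundTripMs + serverInferenceMs

    print("On-device: \(localTotal) ms per chunk")   // prints 170.0
    print("Via server: \(serverTotal) ms per chunk") // prints 260.0
    // Even when the server model is faster, the round trip can dominate,
    // which is what makes on-device attractive for real-time use.
    ```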

    • WhatAmLemmy@lemmy.world · 4 days ago

      There’s no way AirPods have the power or compute to do that (anytime soon). It’ll be via the phone or a Mac.

      • baggachipz@sh.itjust.works · 4 days ago

        They’ll probably use it as a way to sell the iPhone 17, claiming previous iPhones don’t have the necessary computing power, but the new processor does!!