“It’s safe to say that the people who volunteered to ‘shape’ the initiative want it dead and buried. Of the 52 responses at the time of writing, all rejected the idea and asked Mozilla to stop shoving AI features into Firefox.”

  • golden_zealot@lemmy.ml (mod) · 5 days ago

    Hey all, just a reminder to keep the community rules in mind when commenting on this thread. Criticism in any direction is fine, but please maintain your civility and don’t stoop to ad-hominem etc. Thanks.

    • Wooki@lemmy.world · 5 days ago

      don’t stoop to ad-hominem

      At this point, ad-hominem is practically the nice name for the business model of “enshittification”.

  • balsoft@lemmy.ml · 6 days ago

    You want AI in your browser? Just add <your favourite spying ad machine> as a “search engine” option, with a URL like

    https://chatgpt.com/?q=%s
    

    and a shortcut like @ai. You can then ask it anything right there in your search bar.

    Maybe also add one with a query pre-written into the URL, like

    https://chatgpt.com/?q=summarize this page for me: %s
    

    as @ais or something; modern chatbots can make HTTP requests for you. Then if you want to summarize the page you’re on, you do Ctrl+L Ctrl+C @ais Ctrl+V Enter. There, I solved all your AI needs with 4 shortcuts and literally no client-side code.
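
    Note: Firefox substitutes %s as the query placeholder in custom search engines, so if the raw spaces in that second URL don’t save cleanly, a URL-encoded version of the same idea (the exact prompt wording is just an example) should work:

    https://chatgpt.com/?q=summarize%20this%20page%20for%20me%3A%20%s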

      • Leon@pawb.social · 5 days ago

        Emissions. Economic problems. Goading several people into suicide.

        Like, if you ship a food item with harmful bacteria in it, it gets recalled. If you have a fatal design flaw in a car, it gets recalled. If your LLM encourages people to separate from their loved ones and kill themselves, nothing fucking happens.

  • Hirom@beehaw.org · 5 days ago

    The more AI is being pushed into my face, the more it pisses me off.

    Mozilla could have made an extension and promoted it on their add-ons store, rather than adding cruft to the browser and turning it on by default.

    The list of things to turn off to get a pleasant experience in Firefox is getting longer by the day. Not as bad as Chrome, but still.

    • lazynooblet@lazysoci.al · 5 days ago

      Oh, this triggers me. There have been multiple good suggestions for Firefox in the past that were closed as wontfix with “this can be provided by the community as an add-on”. Yet they shove the crappiest crap into the main browser now.

    • pory@lemmy.world · 5 days ago

      Switching to de-Mozilla’d Firefox (Waterfox) is as simple as copying your profile folder from FF to WF. Everything transfers over, and I mean everything. No Mozilla Corp, no opting out of shit in menus at all.

  • railway692@piefed.zip · 6 days ago

    Those unhappy have another option: use an AI‑free Firefox fork such as LibreWolf, Waterfox, or Zen Browser.

    And I have taken that other option.

    Also: Vanadium and/or Ironfox on Android.

    • hitmyspot@aussie.zone · 6 days ago

      A fork is great, but the more a fork deviates, the more issues there are likely to be. Firefox is already at low enough numbers that it’s not really sustainable.

      • ashx64@lemmy.world · 6 days ago

        The truth is that Chromium is really good. It has the best security and performance.

        Vanadium takes that and makes changes to make it more secure and private.

        • Leon@pawb.social · 5 days ago

          I think the problem with Chromium isn’t so much that Blink or V8 is bad, it’s that it’s entirely under the thumb of Google. We’re essentially being set up for another Internet Explorer scenario, only Google, unlike Microsoft, won’t just be resting on its laurels. Google is an advertising company; its entire business model is the web. Google Search is the tool used to find things, and with Google Chrome being the go-to browser for a lot of people, Google essentially ends up in control of both how you access the web and what you access.

          That sort of power is scary, which is why I personally avoid anything Chromium based as much as I am able to. Chromium itself is fantastic, but I don’t like what it represents.

          • ashx64@lemmy.world · 5 days ago

            That’s valid.

            That’s also part of the reason I like WebKit. It’s in a nice spot between Firefox and Chromium when it comes to security and performance. And importantly, it’s not from an ad company, and it often passes on browser specs that would be harmful to privacy and security.

            I forget what the site is called, but I saw one that laid out the different browser specs nicely and gave the explanation why one of the engine developers decided against supporting or implementing each.

            • Leon@pawb.social · 5 days ago

              Gods I wish Epiphany/Gnome Web was better. The Kagi people are working on bringing Orion to Linux, which I believe will be using WebKit there as well.

              It’s kind of funny that we don’t have a solid WebKit browser on Linux, since WebKit has its roots in the KDE Project’s KHTML engine for Konqueror.

              I guess that kind of ties into my anger at these massive tech companies profiting off of FOSS while doing almost fuck-all to contribute. Google opening LLM-generated bug reports against FFmpeg, when all of the streaming-media giants are propped up by that one project, is just one example. There should be some kind of tax for this, I feel. They’re benefiting greatly, and provide nothing in return.

        • TrickDacy@lemmy.world · 6 days ago

          Wrong. You are both popularizing Google tech and decreasing web-browser diversity when you use any Chromium variant.

          • TheOneCurly@feddit.online · 5 days ago

            Vanadium is all about not standing out from the crowd. You use it to not make a statement and to hide your activity within the majority of user agents. If you want to make a statement that’s great, but you should only do it when you’re OK with being fingerprinted.

              • TheOneCurly@feddit.online · 5 days ago

                I didn’t mean that in a negative way. All I meant was that using a non-Chromium browser to help move the needle is a privacy tradeoff. I keep both Vanadium and Ironfox installed and use them at different times for different things.

            • TrickDacy@lemmy.world · 6 days ago

              Are you serious? Chromium is very much mostly written by Google and the direction it takes in every way that matters is entirely controlled by Google.

                • TrickDacy@lemmy.world · 5 days ago

                  It actually does. You’re still supporting a browser monoculture unless you change it so radically that it makes no sense to call it a fork anymore.

  • brucethemoose@lemmy.world · 6 days ago

    Hear me out.

    This could actually be cool:

    • If I could, say, mash in “get rid of the junk in this page” or “turn the page this color” or “navigate this form for me”.

    • If it could block SEO and AI slop from search/pages, including images.

    • If I can pick my own API (including local) and sampling parameters.

    • If it doesn’t preload any model in RAM.

    …That’d be neat.

    What I don’t want is a chatbot or summarizer or deep researcher because there are 7000 bajillion of those, and there is literally no advantage to FF baking it in like every other service on the planet.


    And… honestly, PCs are not ready for local LLMs. Not even the most hyper-optimized trellis quantization of Qwen3 30B is ‘good enough’ to be reliable for the average person, and it still takes too much CPU RAM.

    • azertyfun@sh.itjust.works · 4 days ago

      Honestly, PCs are not ready for local LLMs

      The auto-translation LLM runs locally and works fine. Not quite as good as DeepL, but perfectly competent. That’s the one “AI” feature which is largely uncontroversial, because it’s actually useful, unobtrusive, and privacy-enhancing.

      Local LLMs (and related transformer-based models) can work; they just need a narrow focus. Unfortunately they’re not getting much love, because cloud chatbots can generate a lot of incoherent bullshit really quickly, and that’s a party trick that’s got all the CEOs creaming their pants at the ungrounded fantasy of being just another trillion dollars away from AGI.

      • brucethemoose@lemmy.world · 3 days ago

        Yeah that’s really awesome.

        …But it’s also something the anti-AI crowd, which is a large part of FF’s userbase, would hate once they realize an “LLM” is doing the translation. The well has been poisoned by said CEOs.

        • azertyfun@sh.itjust.works · 3 days ago

          I don’t think that’s really fair. There are cranky contrarians everywhere, but in my experience that feature has been well received, even in the AI-skeptic tech circles that are well educated on the matter.

          Besides, the technical “concerns” are only the tip of the iceberg. The reality is that people complaining about AI often fall back on those concerns because they can’t articulate how most AI fucking sucks to use. It’s an eldritch version of Clippy. It’s inhuman and creepy in an uncanny-valley kind of way; half the time it doesn’t even fucking work right, and even if it does, it’s less efficient than having a competent person (usually me) do the work.

          Auto translation or live transcription tools are narrowly-focused tools that just work, don’t get in the way, and don’t try to get me to talk to them like they are a person. Who cares whether it’s an LLM. What matters is that it’s a completely different vibe. It’s useful, out of my way when I don’t need it, and isn’t pretending to have a first name. That’s what I want from my computer. And I haven’t seen significant backlash to that sentiment even in very left-wing tech circles.

    • sudo@programming.dev · 4 days ago

      If I can pick my own API (including local) and sampling parameters

      You can do this now:

      • Self-host ollama.
      • Self-host open-webui and point it at ollama.
      • Enable local models in about:config.
      • Select “local” instead of ChatGPT or w/e.

      Hardest part is hosting open-webui because AFAIK it only ships as a Docker image.
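
      For reference, the about:config side is just a few prefs; the names below are as of recent Firefox builds and may shift, and the port is wherever your open-webui instance happens to listen:

      browser.ml.chat.enabled = true
      browser.ml.chat.hideLocalhost = false
      browser.ml.chat.provider = http://localhost:3000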

      • brucethemoose@lemmy.world · 3 days ago

        Open WebUI isn’t very ‘open’ and was kinda problematic last I saw. Same with ollama; you should absolutely avoid both.

        …And actually, why is Open WebUI even needed? For an embeddings model or something? All the browser should need is an OpenAI-compatible endpoint.

        • sudo@programming.dev · 3 days ago

          The Firefox AI sidebar embeds an external open-webui. It doesn’t roll its own UI for chat. Everything with AI is done in the quickest, laziest way.

          What exactly isn’t very open about open-webui or ollama? Are there some binary blobs or weird copyright licensing? What alternatives are you suggesting?

          • brucethemoose@lemmy.world · 2 days ago

            https://old.reddit.com/r/opensource/comments/1kfhkal/open_webui_is_no_longer_open_source/

            https://old.reddit.com/r/LocalLLaMA/comments/1mncrqp/ollama/

            Basically, they’re both using their popularity to push proprietary bits, which their development is shifting to. They’re enshittifying.

            In addition, ollama is just a demanding leech on llama.cpp that contributes nothing back, while hiding the connection to the underlying library at every opportunity. They do scummy things like:

            • Renaming models for SEO, like “Deepseek R1” for what is really the 7B distill.

            • Shipping really bad default settings (like a 2K default context limit and imatrix-free quants), which give local LLM runners a bad impression of the whole ecosystem.

            • Messing with chat templates, and on top of that, creating other bugs that don’t exist in base llama.cpp.

            • Sometimes lagging behind on GGUF support.

            • And other times, making their own sloppy implementations for ‘day 1’ support of trending models. These often work poorly; the support’s just there for SEO. But this also leads to some public GGUFs not working with the underlying llama.cpp library, or working inexplicably badly, polluting llama.cpp’s issue tracker.

            I could go on and on with examples of their drama, but needless to say most everyone in localllama hates them. The base llama.cpp maintainers hate them, and they’re nice devs.

            You should use llama.cpp’s llama-server as an API endpoint. Or, alternatively, the ik_llama.cpp fork, kobold.cpp, or croco.cpp. Or TabbyAPI as an ‘alternate’ GPU-focused quantized runtime. Or SGLang if you just batch small models. llama-cpp-python, LM Studio; literally anything but ollama.

            As for the UI, that’s a muddier answer that totally depends on what you use LLMs for. I use mikupad for its ‘raw’ notebook mode and logit displays, but there are many options. Llama.cpp has a pretty nice built-in one now.
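
            For reference, serving a quantized model with llama-server is a one-liner (the model file here is just a placeholder; check llama.cpp’s docs for current flags):

            llama-server -m ./Qwen3-30B-A3B-Q4_K_M.gguf --port 8080 -c 8192

            That exposes an OpenAI-compatible endpoint at http://localhost:8080/v1, plus llama.cpp’s built-in web UI on the same port.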

    • guismo@aussie.zone · 5 days ago

      That would be awesome. Like a Greasemonkey/advanced unlock for those of us who don’t know how to code. So many times I’ve wanted to customise a website but didn’t know how, or it wasn’t worth the effort.

      But only if it was local; and especially on mobile, where I need it the most, that will be impossible for years…

      • brucethemoose@lemmy.world · 4 days ago

        I mean, you can run small models on mobile now, but they’re mostly good as a cog in an automation pipeline, not at (say) interpreting English instructions on how to alter a webpage.

        …Honestly, open-weight model APIs for one-off calls like this are not a bad stopgap. It’s literally pennies, and power-efficient.
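
        For scale, a one-off call against any OpenAI-compatible provider looks roughly like this; the endpoint, model name, and key are all placeholders:

        curl https://api.example.com/v1/chat/completions \
          -H "Authorization: Bearer $API_KEY" \
          -H "Content-Type: application/json" \
          -d '{"model": "some-open-weight-model", "messages": [{"role": "user", "content": "Hide the cookie banner in this HTML: ..."}]}'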

        • guismo@aussie.zone · 4 days ago

          You mean using an online LLM?

          No. That’s what I don’t want. If it was a company I trusted I would, but good luck with that. Mozilla is not that company anymore, even if they had the resources to host their own.

          But locally, or on a server I trust? That would be awesome. AI is awesome; it’s the people who run it who aren’t.

          • brucethemoose@lemmy.world · 4 days ago

            I mean, there are literally hundreds of API providers. I’d probably pick Cerebras, but you can take your pick from any jurisdiction and any privacy policy.

            I guess you could rent an on-demand cloud instance yourself too, that spins down when you aren’t using it.

          • cassandrafatigue@lemmy.dbzer0.com · 4 days ago

            Your server has not a monopoly on, but a majority of, the worst shitlibs and other chuds. To the point that I’m genuinely surprised by agreeing with someone there, and am worried that when I examine it closely you’ll be agreeing with me for some unthinkably horrible reason.

            • Professorozone@lemmy.world · 4 days ago

              The problem is I fundamentally do not understand how Lemmy works, so I just picked what seemed obvious. Like, why wouldn’t I want the world?

              Also, I thought from just reading sub-Lemmies(?) that .ml was the crap hole.

              Also, I looked up “chud” and that’s really mean.

              • golden_zealot@lemmy.ml (mod) · 3 days ago

                I would say that while there are general rules of thumb, it’s generally good to never assume the intentions or beliefs of another user based solely on their home server. There are nice people all over, and there are also a lot of assholes all over.

                By the way, as to your question mark: they’re just called “Communities” on Lemmy, though I think some instances occasionally call them something different.

  • voodooattack@lemmy.world · 5 days ago

    Why not just distribute a separate build and call it “Firefox AI Edition” or something? Making this available in the base binary is a big mistake, at least when done immediately and without testing the waters.

    • Leon@pawb.social · 5 days ago

      There is already a Firefox Developer Edition, so I don’t see why not. Personally, though, I don’t care to see them waste the time on AI features.

  • PearOfJudes@lemmy.ml · 5 days ago

    I think Mozilla’s base is privacy-focused individuals, many of whom appreciate Firefox’s open-source nature and the privacy-hardened Firefox forks. From a PR perspective, Firefox would gain users by adamantly going against AI tech.

    • JackbyDev@programming.dev · 5 days ago

      Maybe their thought process is that they’ll gain more users by adopting AI while knowing they’re still the most privacy-focused of the major browsers. Where have I seen this mentality before?

      Spoiler

      The American Democratic Party often believes it can get more votes by shifting conservative, believing the more progressive voters will still pick them because they’re still more progressive than not.

  • 1984@lemmy.today · 6 days ago

    I think I’ve lost hope at this point of seeing AI be actually useful in any application except ChatGPT and code editors.

    Companies are struggling to figure out how to use AI in their products, because it doesn’t actually improve their product, but they really, really want it to.

  • m-p{3}@lemmy.ca · 4 days ago

    It depends. If it’s just for the sake of plugging AI because it’s cool and trendy, fuck no.

    If it’s to improve privacy and accessibility and minimize our dependency on big tech, then I think it’s a good idea.

    A good example of AI in Firefox is the Translate feature (Project Bergamot). It works entirely locally, but relies on trained models to provide translation on demand, without having Google etc. as the middleman, and Mozilla has no idea what you translate, just which language model(s) you downloaded.

    Another example is local alt-text generation for images, which also requires a trained model. Again, it works entirely locally, and provides some accessibility to users with a vision impairment when an image doesn’t provide a caption.

  • FoundFootFootage78@lemmy.ml · 6 days ago

    I considered using AI to summarize news articles that don’t seem worth the time to read in full (the attention industrial complex is really complicating my existence). But I turned it off and couldn’t find the button to turn it back on.

    • fodor@lemmy.zip · 6 days ago

      If you need to summarize the news, which is already a summary of an event containing the important points and nothing else, then AI is the wrong tool. A better journalist is what you actually need. The whole point of good journalism is that it already did that work for you.

      • FoundFootFootage78@lemmy.ml · 6 days ago

        I have a real journalist, but this is more on the “did you know this was important” side. Like how it’s fine to rinse your mouth out after brushing your teeth, but if your water isn’t fluoridated then you probably shouldn’t (which I got from skimming the article for the actionable information).

    • rozodru@pie.andmc.ca · 6 days ago

      You have to be REALLY careful when asking an LLM to summarize news, otherwise it will hallucinate what it believes sounds logical and correct. You have to point it directly to the article, ensure that it reads it, and then have it summarize. And honestly, at that point… you might as well read it yourself.

      And this goes beyond just summarizing articles: you NEED to provide an LLM a source for just about everything now. Even if you tell it to research the solution to a problem online, it will often pull in irrelevant links and build its solution from them because, again, that’s what makes the most sense to the LLM, when in reality it has nothing to do with your problem.

      At this point it’s an absolute waste of time using any LLM, because within the last few months all models have noticeably gotten worse. Claude.ai is an absolute waste of time, as 8 times out of 10 the solutions are hallucinations, and recently GPT-5 has started “fluffing” solutions with irrelevant information, or it info-dumps things that have nothing to do with your original prompt.

  • katy ✨@piefed.blahaj.zone · 6 days ago

    ai can be good as long as you don’t let it think for you. i think the problem is taking resources from development and building something into the browser that could just be a bookmark to a webpage.

    why don’t they instead just put vivaldi’s web panel sidebar into firefox, so you can add chatgpt or whatever as a web panel? i think that would be infinitely more useful (and it could be used for sites other than ai assistants).

    • Tangentism@lemmy.ml · 6 days ago

      ai can be good as long as you don’t let it think for you

      Unfortunately, there are too many people already doing that, with not-so-clever results!

      If it increases accessibility for those with additional requirements then great, but we know that’s not even in its top 10 reasons for being implemented.