Close Menu
TechUpdateAlert

    Subscribe to Updates

    Get the latest creative news from FooBar about art, design and business.

    What's Hot

    My Health Anxiety Means I Won’t Use Apple’s or Samsung’s Smartwatches. Here’s Why

    December 22, 2025

    You can now buy the OnePlus 15 in the US and score free earbuds if you hurry

    December 22, 2025

    Today’s NYT Connections: Sports Edition Hints, Answers for Dec. 22 #455

    December 22, 2025
    Facebook X (Twitter) Instagram
    Trending
    • My Health Anxiety Means I Won’t Use Apple’s or Samsung’s Smartwatches. Here’s Why
    • You can now buy the OnePlus 15 in the US and score free earbuds if you hurry
    • Today’s NYT Connections: Sports Edition Hints, Answers for Dec. 22 #455
    • Android might finally stop making you tap twice for Wi-Fi
    • Today’s NYT Mini Crossword Answers for Dec. 22
    • Waymo’s robotaxis didn’t know what to do when a city’s traffic lights failed
    • Today’s NYT Wordle Hints, Answer and Help for Dec. 22 #1647
    • You Asked: OLED Sunlight, VHS on 4K TVs, and HDMI Control Issues
    Facebook X (Twitter) Instagram Pinterest Vimeo
    TechUpdateAlertTechUpdateAlert
    • Home
    • Gaming
    • Laptops
    • Mobile
    • Software
    • Reviews
    • AI & Tech
    • Gadgets
    • How-To
    TechUpdateAlert
    Home»Gaming»Hackers can hide AI prompt injection attacks in resized images
    Gaming

    Hackers can hide AI prompt injection attacks in resized images

    techupdateadminBy techupdateadminAugust 26, 2025No Comments2 Mins Read
    Share Facebook Twitter Pinterest LinkedIn Tumblr Reddit Telegram Email
    webcam hacker
    Share
    Facebook Twitter LinkedIn Pinterest Email

    “AI” tools are all the rage at the moment, even among users who aren’t all that savvy when it comes to conventional software or security—and that’s opening up all sorts of new opportunities for hackers and others who want to take advantage of them. A new research team has discovered a way to hide prompt injection attacks in uploaded images.

    A prompt injection attack is a way to hide instructions for an LLM or other “artificial intelligence” system, usually somewhere a human operator can’t see them. It’s the whispered “loser-says-what” of computer security. A great example is hiding a phishing attempt in an email in plain text that’s colored the same as the background, knowing that Gemini will summarize the text even though the human recipient can’t read it.

    A two-person Trail of Bits research team discovered that they can also hide these instructions in images, making the text invisible to the human eye but revealed and transcribed by an AI tool when an image is compressed for upload. Compression—and the artifacts that come along with it—are nothing new. But combined with the sudden interest in hiding plain text messages, it creates a new way to get instructions to an LLM without the user knowing those instructions have been sent.

    In the example highlighted by Trail of Bits and BleepingComputer, an image is delivered to a user, the user uploads the image to Gemini (or uses something like Android’s built-in circle-to-search tool), and the hidden text in the image becomes visible as Google’s backend compresses it before it’s “read” to save on bandwidth and processing power. After being compressed, the prompt text is successfully injected, telling Gemini to email the user’s personal calendar information to a third party.

    That’s a lot of legwork to get a relatively small amount of personal data, and both the complete attack method and the image itself need to be tailored to the specific “AI” system that’s being exploited. There’s no evidence that this particular method was known to hackers before now or is being actively exploited at the time of writing. But it illustrates how a relatively innocuous action—like asking an LLM “what is this thing?” with a screenshot—could be turned into an attack vector.

    Attacks Hackers Hide Images injection Prompt resized
    Share. Facebook Twitter Pinterest LinkedIn Tumblr Email
    Previous ArticleEchoStar finalizes the sale of major Dish wireless spectrum to AT&T
    Next Article Best Bird Feeders With Cameras, Tested and Reviewed (2025)
    techupdateadmin
    • Website

    Related Posts

    Mobile

    ChatGPT Images is now faster and better than ever

    December 17, 2025
    Mobile

    SpaceX marks 100th Space Coast launch of 2025 with stunning Falcon 9 images

    December 17, 2025
    Gadgets

    100,000 images in, Mars is still full of surprises

    December 17, 2025
    Add A Comment
    Leave A Reply Cancel Reply

    Top Posts

    NYT Strands hints and answers for Monday, August 11 (game #526)

    August 11, 202545 Views

    These 2 Cities Are Pushing Back on Data Centers. Here’s What They’re Worried About

    September 13, 202542 Views

    Today’s NYT Connections: Sports Edition Hints, Answers for Sept. 4 #346

    September 4, 202540 Views
    Stay In Touch
    • Facebook
    • YouTube
    • TikTok
    • WhatsApp
    • Twitter
    • Instagram
    Latest Reviews

    Subscribe to Updates

    Get the latest tech news from FooBar about tech, design and biz.

    Most Popular

    Best Fitbit fitness trackers and watches in 2025

    July 9, 20250 Views

    There are still 200+ Prime Day 2025 deals you can get

    July 9, 20250 Views

    The best earbuds we’ve tested for 2025

    July 9, 20250 Views
    Our Picks

    My Health Anxiety Means I Won’t Use Apple’s or Samsung’s Smartwatches. Here’s Why

    December 22, 2025

    You can now buy the OnePlus 15 in the US and score free earbuds if you hurry

    December 22, 2025

    Today’s NYT Connections: Sports Edition Hints, Answers for Dec. 22 #455

    December 22, 2025

    Subscribe to Updates

    Get the latest creative news from FooBar about art, design and business.

    Facebook X (Twitter) Instagram Pinterest
    • About Us
    • Contact us
    • Privacy Policy
    • Disclaimer
    • Terms and Conditions
    © 2026 techupdatealert. Designed by Pro.

    Type above and press Enter to search. Press Esc to cancel.