Heap-overflowing Llama.cpp to RCE

retr0reg | 248 points

This is really incredible work. And the fact that you are 15 is blowing my mind. You have a really bright future ahead of you, and your parents must be really proud (at least I would be if you were my kid). Hit me up if you want a summer internship finding security vulnerabilities at a hotel software startup (access control, property management, etc.).

VladVladikoff | 25 days ago

This is amazing—made even more impressive by the fact that the author is just 15 years old!

Also, it's nice to see this mentioned:

> For this 10k-word write-up, I spent around a month finishing up the main parts, and refining/editing it took an extra while. Writing this is indeed a painful process. I spent the entire day on the weekend and 4-5 hours during the rest of the week working on it for around two weeks.

It's the kind of behind-the-scenes effort that often goes unspoken.

yamrzou | a month ago

I tried to execute the PoC by running the following:

  git clone https://github.com/ggml-org/llama.cpp.git && cd llama.cpp
  git checkout c0d4843225eed38903ea71ef302a02fa0b27f048 # Check out a revision prior to the exploit fix in 1d20e53c40c3cc848ba2b95f5bf7c075eeec8b19
  mkdir build-rpc && cd build-rpc
  cmake .. -DGGML_RPC=ON
  cmake --build . --config Release
  cd bin/
  ./rpc-server -p 50052

In a second terminal:

  nc -lvp 1337

Then I ran the exploit code in a third terminal (from the llama.cpp/build-rpc/bin directory):

  pip install pwntools
  python exp.py # From https://gist.github.com/retr0reg/d13de3fde8f9d138fe1af48e59e630a9
It failed at "Stage Three: Bypass boundary check via libggml" and raised an EOFError; the RPC server exited with a segmentation fault. Any idea why?
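
For context: pwntools raises EOFError when the remote side closes the connection, which lines up with the rpc-server segfaulting. A quick sanity check like the sketch below (same local host/port as in the commands above; not part of the original exploit) confirms whether the rpc-server is still accepting connections before re-running exp.py:

  # sanity_check.py -- sketch only; assumes rpc-server is listening on 127.0.0.1:50052
  from pwn import remote

  io = remote("127.0.0.1", 50052)          # connect to the locally started rpc-server
  print("rpc-server reachable:", io.connected())
  io.close()
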
yamrzou | a month ago

Sheesh. The visual aesthetics and script behavior on your blog are so tastefully executed. Great job!

rboyd | 25 days ago

Can anyone tl;dr this? Does this mean that it's possible for a maliciously crafted LLM to execute arbitrary code via an exploit in llama.cpp?

zaphod420 | 25 days ago

Prodigies are amazing, but I often wonder what they end up doing later in life, once the intelligence gap between them and their peers converges to zero.

behnamoh | 25 days ago

Not surprising; the llama.cpp code is a mess.

It's sad that hacked-together things that emerge first end up way more popular than properly done projects that come later.

om8 | 25 days ago