Table of Contents
Setting the Stage: Why we need artifact parsing automation
Hello friend, this article is a bit lengthy, but it took me two days to create: one day spent thinking through and writing down the ideas, and another day of implementation filled with errors, misconfigurations, and a strong determination to pull off this approach to Windows artifact digital forensics using SIEM solutions for easy correlation and artifact extraction.
The idea for this technical article came to me as a digital forensics enthusiast while working on digital forensics labs. I often came across Eric Zimmerman's tools for artifact parsing, such as AmcacheParser, EvtxECmd, MFTECmd, MFTExplorer, Registry Explorer, and many others.
The daily tasks of a security analyst involve monitoring, analyzing alerts, and solving tickets. The logs that analysts mostly focus on are Windows native logs, Sysmon, and Syslog. But why wait until an incident occurs to take an image of the device and then access the digital artifacts using dedicated digital forensics tools? Instead, why not send these parsed artifacts, ready for correlation and investigation, directly to the SIEM solution before, during, and after the incident? This would enrich the SIEM logs with a greater view of device health, executed binaries, accessed folders, and more.
In today's blog, we'll explore how to automate the parsing of a couple of Windows artifacts, using scheduled tasks to send them to Splunk SIEM via the Universal Forwarder. We'll also cover the installation and configuration of Splunk and the Universal Forwarder. All the tools we'll use are free, with trial limitations.
Essentials for the Journey: VMware Network Connection Establishment
All you need is a cup of coffee, VMware with a Kali Linux machine, and a Windows machine. The Kali Linux machine will act as our server hosting the Splunk SIEM, and the Windows machine will have the Universal Forwarder installed.
Now we will establish a connection between both virtual machines using the NAT feature in VMware: open VMware Workstation/Player, click on the VM, then Settings, then Network Adapter, and select the NAT network connection. With NAT, the VMs can access the internet through the host's IP address, but they cannot directly communicate with physical devices on the external network; they can only communicate with each other within the virtual network.
After configuring both machines to communicate over the NAT network, we will test connectivity by running the ping command once from the Kali Linux machine and once from the Windows machine.
PS C:\Windows\system32> ping xxx.xxx.xxx.xxx
Pinging xxx.xxx.xxx.xxx with 32 bytes of data:
Reply from xxx.xxx.xxx.xxx: bytes=32 time=1ms TTL=64
Reply from xxx.xxx.xxx.xxx: bytes=32 time=1ms TTL=64
Reply from xxx.xxx.xxx.xxx: bytes=32 time<1ms TTL=64
Reply from xxx.xxx.xxx.xxx: bytes=32 time<1ms TTL=64
Ping statistics for xxx.xxx.xxx.xxx:
Packets: Sent = 4, Received = 4, Lost = 0 (0% loss),
Approximate round trip times in milli-seconds:
Minimum = 0ms, Maximum = 1ms, Average = 0ms
┌──(kali㉿kali)-[~]
└─$ ping xxx.xxx.xxx.xxx
PING xxx.xxx.xxx.xxx (xxx.xxx.xxx.xxx) 56(84) bytes of data.
64 bytes from xxx.xxx.xxx.xxx: icmp_seq=1 ttl=128 time=0.939 ms
64 bytes from xxx.xxx.xxx.xxx: icmp_seq=2 ttl=128 time=0.764 ms
64 bytes from xxx.xxx.xxx.xxx: icmp_seq=3 ttl=128 time=0.738 ms
64 bytes from xxx.xxx.xxx.xxx: icmp_seq=4 ttl=128 time=0.791 ms
64 bytes from xxx.xxx.xxx.xxx: icmp_seq=5 ttl=128 time=0.688 ms
As we can see, the first step to establish a connection between both devices is complete. Now, we'll move on to the second part and explore how to parse the artifacts.
PowerForensics: The Master of Artifact Parsing
The $MFT (Master File Table) is a hidden system file in NTFS that stores metadata about all files and directories. When you browse your C:\ drive, you won't see the $MFT file and other such artifacts, due to the system's file handling and security settings. For forensic purposes, tools like FTK Imager or EnCase are used to extract and analyze these files, as they can bypass security restrictions and properly interpret the metadata.
However, we also have PowerShell tools that can access and parse these files. One such tool is PowerForensics. The purpose of PowerForensics is to provide an all-inclusive framework for hard drive forensic analysis. It currently supports the NTFS and FAT file systems. PowerForensics is built on a C# class library (assembly) that provides a public forensic API. All of this module's cmdlets are built on this public API and can easily be expanded upon to create new cmdlets.
You can explore these two links for more information about the tool: one from the PowerShell Gallery, which documents the cmdlets and how to use them (PowerForensics), and the other from GitHub, which we will use for installation.
PowerUp Your Forensics: Installing PowerForensics and Running Cmdlets
Now, I will provide the PowerShell commands that need to be executed inside PowerShell as Administrator in order to install the PowerForensics module and run its cmdlets.
# Step 1: Check the current execution policy
Get-ExecutionPolicy
# Step 2: Set the execution policy to RemoteSigned, which allows locally created scripts and scripts signed by a trusted publisher, but blocks unsigned scripts from remote sources
Set-ExecutionPolicy RemoteSigned -Scope CurrentUser
# Step 3: Import the PowerShellGet module
Import-Module PowerShellGet
# Step 4: Download the PowerForensics repository from GitHub (https://github.com/Invoke-IR/PowerForensics)
# Step 5: Unzip the downloaded archive to get the PowerForensics-master folder
# Step 6: Move the module folder into the PowerShell Modules directory
Move-Item -Path "C:\path\to\PowerForensics-master\Modules\PowerForensics" -Destination "C:\Program Files\WindowsPowerShell\Modules\PowerForensics"
# Step 7: Import the module
Import-Module 'C:\Program Files\WindowsPowerShell\Modules\PowerForensics\PowerForensics.psd1'
Now that the module is installed, we will perform some tests using a couple of cmdlets. The first one is Get-ForensicRegistryValue, which parses a registry hive and returns the values of a specified key. We then save the output to ForensicRegistryValueRun.csv in the C:\ directory.
Get-ForensicRegistryValue -HivePath C:\Windows\system32\config\SOFTWARE -Key Microsoft\Windows\CurrentVersion\Run | Export-Csv -Path "C:\ForensicRegistryValueRun.csv" -NoTypeInformation
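As a side note, you can sanity-check the exported CSV outside PowerShell. The short Python sketch below reads one column from a CSV export; the column name ValueName is an assumption about PowerForensics' output headers, so match it to whatever headers your file actually contains.

```python
import csv

def list_csv_column(path, column):
    """Return the non-empty entries of one column from a CSV export.

    utf-8-sig transparently handles the BOM that PowerShell's
    Export-Csv may write at the start of the file.
    """
    with open(path, newline="", encoding="utf-8-sig") as f:
        return [row[column] for row in csv.DictReader(f) if row.get(column)]
```

For example, `list_csv_column(r"C:\ForensicRegistryValueRun.csv", "ValueName")` would list the autorun value names found under the Run key, assuming that column name exists in the export.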
Now, we will perform the same procedure for the $MFT file using the Get-ForensicFileRecord cmdlet. This cmdlet parses the $MFT file and returns an array of FileRecord entries. We will save the results in the $mft variable and then execute $mft to view the metadata. You can follow the same steps as in the previous code block to save the results into a .csv file, which we will need at the end of the article.
$mft = Get-ForensicFileRecord
$mft
The same procedure applies to the Get-ForensicUsnJrnl cmdlet, which parses the $UsnJrnl file's $J data stream and returns UsnJrnl entries, which we then save to the Get-ForensicUsnJrnl.csv file. The Update Sequence Number (USN) Journal is an NTFS feature, enabled by default since Vista, that keeps a record of changes made to the NTFS volume, such as file or directory creation, deletion, or modification.
$usn = Get-ForensicUsnJrnl
$usn | Export-Csv -Path "C:\Get-ForensicUsnJrnl.csv" -NoTypeInformation
PowerShell Script for Creating the Master Header
Now we will create a PowerShell script that chains multiple cmdlet executions. This script extracts registry data from the SOFTWARE hive in Windows, focusing on specific registry keys such as CurrentVersion and Explorer\User Shell Folders. It uses the Get-ForensicRegistryKey cmdlet to gather the data and saves each registry key's values to a CSV file in a designated directory. Additionally, it retrieves the USN Journal data using the Get-ForensicUsnJrnl cmdlet and exports it to a CSV file. Each file is dynamically named based on the registry key or USN Journal data being extracted. The saved files are stored in the C:\WindowsForensicSplunkExtraction directory for easy access.
# Define the path for the SOFTWARE hive
$softwareHive = "C:\Windows\system32\config\SOFTWARE"
# Define the output directory and create it if it does not exist yet
$outputDir = "C:\WindowsForensicSplunkExtraction"
if (-not (Test-Path $outputDir)) { New-Item -ItemType Directory -Path $outputDir | Out-Null }
# Define the registry keys to extract from the SOFTWARE hive
$registryKeysSoftware = @(
    "Microsoft\Windows NT\CurrentVersion",
    "Microsoft\Windows Defender\Exclusions",
    "Microsoft\Windows NT\CurrentVersion\ProfileList",
    "Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders",
    "Microsoft\Windows NT\CurrentVersion\Time Zones",
    "Microsoft\Windows NT\CurrentVersion\Winlogon",
    "Microsoft\Windows\CurrentVersion\Explorer\Advanced",
    "Microsoft\Windows\CurrentVersion\Uninstall",
    "Microsoft\Windows\CurrentVersion\Installer\UpgradeCodes"
)
# Loop through each registry key in the SOFTWARE hive
foreach ($key in $registryKeysSoftware) {
    Write-Host "Extracting data from: $key"
    # Get the registry key values
    $keyData = Get-ForensicRegistryKey -HivePath $softwareHive -Key $key
    # Export the data to a CSV named after the key's last path segment
    $fileName = Join-Path $outputDir ("Get-ForensicRegistryKey" + ($key -split '\\' | Select-Object -Last 1) + ".csv")
    $keyData | Export-Csv -Path $fileName -NoTypeInformation
    Write-Host "Registry data has been saved to: $fileName"
}
# Get the USN Journal data and export it to CSV
$usn = Get-ForensicUsnJrnl
$usnFileName = Join-Path $outputDir "Get-ForensicUsnJrnl.csv"
$usn | Export-Csv -Path $usnFileName -NoTypeInformation
Write-Host "USN Journal data has been saved to: $usnFileName"
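As a quick sketch on the side (not part of the script's workflow), a few lines of Python can confirm that every expected CSV actually landed in the extraction directory and is non-empty. The directory path simply mirrors the C:\WindowsForensicSplunkExtraction folder used above.

```python
from pathlib import Path

def verify_exports(directory, expected_names):
    """Return the expected CSV files that are missing or empty in directory."""
    base = Path(directory)
    return [
        name
        for name in expected_names
        # short-circuit: stat() is only called when the file exists
        if not (base / name).is_file() or (base / name).stat().st_size == 0
    ]
```

An empty list back from `verify_exports(r"C:\WindowsForensicSplunkExtraction", [...])` means every export is present and non-empty.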
You can add or restructure keys as needed to monitor their values, such as for the $MFT file and more. You might wonder why I didn't do that here: the free trial of Splunk Enterprise allows only 500 MB of data to be ingested per day during the 60-day trial, and the $MFT contains a large amount of data. However, as mentioned, we will review the $MFT data in Splunk later in the article.
After going over the PowerShell script above, you can save it as RegistryForensics.ps1, or any other name you prefer, in a location of your choice.
Testing Phase: Ensuring We're Not Newbies
Now, we will simply execute the PowerShell script while monitoring the directory where we have saved the extracted and parsed artifacts, including the registry keys.
PS C:\Users\semo\Desktop> .\RegistryForensics.ps1
Extracting data from: Microsoft\Windows NT\CurrentVersion
Registry data has been saved to: C:\WindowsForensicSplunkExtraction\Get-ForensicRegistryKeyCurrentVersion.csv
Extracting data from: Microsoft\Windows Defender\Exclusions
Registry data has been saved to: C:\WindowsForensicSplunkExtraction\Get-ForensicRegistryKeyExclusions.csv
Extracting data from: Microsoft\Windows NT\CurrentVersion\ProfileList
Registry data has been saved to: C:\WindowsForensicSplunkExtraction\Get-ForensicRegistryKeyProfileList.csv
Extracting data from: Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders
Registry data has been saved to: C:\WindowsForensicSplunkExtraction\Get-ForensicRegistryKeyUser Shell Folders.csv
Extracting data from: Microsoft\Windows NT\CurrentVersion\Time Zones
Registry data has been saved to: C:\WindowsForensicSplunkExtraction\Get-ForensicRegistryKeyTime Zones.csv
Extracting data from: Microsoft\Windows NT\CurrentVersion\Winlogon
Registry data has been saved to: C:\WindowsForensicSplunkExtraction\Get-ForensicRegistryKeyWinlogon.csv
Extracting data from: Microsoft\Windows\CurrentVersion\Explorer\Advanced
Registry data has been saved to: C:\WindowsForensicSplunkExtraction\Get-ForensicRegistryKeyAdvanced.csv
Extracting data from: Microsoft\Windows\CurrentVersion\Uninstall
Registry data has been saved to: C:\WindowsForensicSplunkExtraction\Get-ForensicRegistryKeyUninstall.csv
Extracting data from: Microsoft\Windows\CurrentVersion\Installer\UpgradeCodes
Registry data has been saved to: C:\WindowsForensicSplunkExtraction\Get-ForensicRegistryKeyUpgradeCodes.csv
USN Journal data has been saved to: C:\WindowsForensicSplunkExtraction\Get-ForensicUsnJrnl.csv
PS C:\Users\semo\Desktop>
As we can see, our work is complete: the PowerShell script has been executed, and everything is done. But how will we ensure that new data keeps getting added and parsed? This is where the scheduled task comes in.
I won't repeat it: the scheduled task is the ultimate repeater.
To create a scheduled task, start by opening Task Scheduler: press Windows + R, type taskschd.msc, and press Enter. Then, in the Task Scheduler window, click Create Basic Task.
Enter a task name and description, then select a trigger, such as daily. Next, choose the action, usually Start a Program, and specify PowerShell.exe as the program, with the path of the script we created, C:\Users\semo\Desktop\RegistryForensics.ps1, as its argument. Finally, review your settings and click Finish to schedule the task.
Check the image below, which demonstrates a step-by-step guide for creating the scheduled task.
As we can see, we have set up the scheduled task to run daily, 5 minutes after the user logs on, and every 5 minutes thereafter. Each run deletes the previous file and writes the new version. Running the task with administrative privileges ensures the PowerShell script executes correctly.
We can test the process by running the task with the Run button while keeping an eye on the designated location.
To double-check, we can simply delete the created files and restart the device. After waiting 5 minutes, we can check the designated location again to ensure the parsed registry keys and other digital artifacts have been recreated.
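For an extra check that the schedule really fires, you can test whether an output file was modified recently. This is just a hedged Python sketch on the side, not part of the setup; the 300-second window matches the 5-minute schedule above.

```python
import time
from pathlib import Path

def is_fresh(path, max_age_seconds=300):
    """True if path exists and was modified within the last max_age_seconds."""
    p = Path(path)
    return p.is_file() and (time.time() - p.stat().st_mtime) <= max_age_seconds
```

For example, `is_fresh(r"C:\WindowsForensicSplunkExtraction\Get-ForensicUsnJrnl.csv")` should return True shortly after the task runs.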
Splunk Installation: The Master of Our Story
Now, we need to install Splunk, the powerful and extensible data platform that processes data from any cloud, data center, or third-party tool. All we need to do is create an account, install Splunk Enterprise on our Linux machine from the Splunk Enterprise free 60-day trial download page, and install the Universal Forwarder on our Windows machine from the Universal Forwarder download page. Universal Forwarders provide reliable, secure data collection from remote sources and forward that data into Splunk software for indexing and consolidation.
Check out the image below, which shows what you'll see once you open the portal and create your account.
Now that we have the Splunk Enterprise package on our Linux machine, we just need to install the .deb file we downloaded using sudo apt install ./installedfile.deb, as APT manages dependencies and ensures proper installation on Debian-based systems. Then we start Splunk by running the ./splunk start command inside the /opt/splunk/bin directory.
Press Enter to accept the license, then provide a username and password for the admin account. After that, you are ready to access the Splunk Enterprise portal.
Sending the logs: The scent of success
Now, we need to add data to Splunk using what the Universal Forwarder sends. This is done by opening the receiving-data configuration and creating a new receiving port, using the default port 9997.
Essentially, this opens port 9997 on the Linux machine in listening mode to capture the logs sent from the Universal Forwarder, which we will deploy on the Windows machine. Check the images below, which show in detail how to configure the receiving port.
As we can see, we now have our port ready to capture the data that will be sent to it.
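If you want to confirm the listener from the Windows side before installing anything, a plain TCP connection attempt will do. Here is a minimal Python sketch, assuming you substitute your Kali machine's actual IP address:

```python
import socket

def port_reachable(host, port, timeout=3.0):
    """Attempt a TCP connection; True means something is listening there."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # covers refused connections, timeouts, and unreachable hosts
        return False
```

`port_reachable("xxx.xxx.xxx.xxx", 9997)` should return True once Splunk's receiving port is configured and listening.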
Universal Forwarder: The voice in the forest.
Now, after installing the forwarder on the Windows machine, we'll configure it as follows: launch the downloaded .msi installer, accept the license agreement, and choose to use the forwarder with an on-premises Splunk Enterprise instance.
We won't provide a password for the certificate or upload one; instead, we will use the default Splunk certificate, so click Next.
After that, choose to run it under the Local System account so it can access all the data and forward it. Click Next.
Click on Directory, then enter the path of the logs we created for monitoring, specifically C:\WindowsForensicSplunkExtraction, while unchecking the boxes for Windows event logs, Active Directory, and performance monitoring.
Enter the username
and password
related to the admin account.
For the deployment server, skip by clicking Next, then enter the IP address of the Kali Linux machine with port 9997 as the receiving indexer. You can check the Linux machine's IP by running the ip a command in the terminal and looking at the eth0 network interface. After this, we are ready to run the installation.
Connection is complete, Sir: Now, I want to test on SPL.
Now, by opening Splunk Enterprise on the Kali Linux machine, we can access the Search and Reporting application and check the data summary. We can see that the Windows device is connected, the accessed files appear as sources, and the source type is CSV, which is the data format of the logs.
We can then build the SPL query below to check the logs with the file name setupapi.dev.log and present the data as a visualization of the top 20 results, aggregated by the date_second field, which is displayed in the Interesting Fields section.
index=main source="C:\\WindowsForensicSplunkExtraction\\Get-ForensicUsnJrnl.csv" FileName="setupapi.dev.log" | top limit=20 date_second
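Under the hood, top is just a frequency count with percentages over the matching events. As a mental model only (nothing you need to run for the lab), the same aggregation can be sketched in Python:

```python
from collections import Counter

def top_values(values, limit=20):
    """Mimic Splunk's `top`: the most frequent values with count and percent."""
    counts = Counter(values)
    total = sum(counts.values())
    return [
        (value, count, round(100.0 * count / total, 6))
        for value, count in counts.most_common(limit)
    ]
```

Feeding it the date_second value of every matching event would reproduce the table the SPL query renders.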
We can now check the Monitoring Console, which shows the amount of ingested data and the disk usage consumed by the data the indexer has written to the index.
Shut down the system: We need a break
After this long journey, we will shut down
Splunk Enterprise, the Universal Forwarder, and the scheduled tasks.
On the Windows machine, we will navigate to the C:\Program Files\SplunkUniversalForwarder\bin folder and stop the Universal Forwarder using the stop command, then check its state using the status command.
PS C:\Program Files\SplunkUniversalForwarder\bin> .\splunk.exe stop
SplunkForwarder: Stopped
PS C:\Program Files\SplunkUniversalForwarder\bin> .\splunk.exe status
SplunkForwarder: Stopped
We will do the same on the Kali Linux machine by navigating to the /opt/splunk/bin directory and stopping the Splunk Enterprise instance using the same command as above, then running netstat on the indexer to check for active connections.
┌──(root㉿kali)-[/opt/splunk/bin]
└─# netstat -an | grep 9997
tcp 0 0 0.0.0.0:9997 0.0.0.0:* LISTEN
Since there are no ESTABLISHED connections, this suggests no forwarders are currently sending data. We can double-check by running the query below; if no results appear, the forwarder is not sending data.
index=_internal source="*metrics.log" group=tcpin_connections
Finally, on the Windows machine, we will open Task Scheduler and disable the scheduled task we created, to prevent the PowerShell script from executing every 5 minutes.
I can't sleep; we need that $MFT file
As I mentioned earlier, after executing $mft = Get-ForensicFileRecord, we can export the data as we did with the previous commands and save it to a CSV file. But how can we view it now? Essentially, Splunk lets us upload parsed data, and it works smoothly with CSV files. So we take the parsed $MFT CSV file and upload it to Splunk.
Check the image below, where I go through how to add data using the upload method.
After that, we will configure the timestamp and adjust the advanced settings so all related data is displayed. The delimiter setting defaults to comma, which is the delimiter in CSV files. Then we will create a source type, naming it as you prefer, along with an index named WindowsForensics and the host field set to SemoWindowsForensics. Essentially, the data will be associated with the specified host and source type.
We check the uploaded data by searching on the index name with the index command, and voilà!
We have the parsed data uploaded.
A Masterpiece was Created
I hope you've made it through the entire article! This one's easily the best I've written so far; though I can't predict the future, I'm aiming to top it. If you have any questions, feel free to DM me on LinkedIn (www.linkedin.com); I'm happy to chat!