How Experience and Management Skills Improve Data Analysis for Security Professionals
The other day, I found myself reflecting on my career and how things have evolved as I've grown older. In doing so, I realized how fortunate I am to have started my career at such an early age. I can still remember being fascinated by using BBSs (Bulletin Board Systems), patching motherboards, working with Windows 3.11 and NT, and using ICQ along with various other applications and operating systems. By the way, just mentioning these applications makes me feel like my beard has gained a few more white hairs.
I strongly believe that everything we learn in life contributes to the abilities and knowledge we carry with us — our knowledge is built brick by brick. Sure, sometimes some of those bricks fall on our heads, but we pick them up and keep building. Unfortunately, one case where the bricks fall is when people treat technical and management skills as separate worlds instead of combining them to improve both.
While my career has been centered on technical work, I've also gained experience working with management, which has taught me the importance of regulations, compliance, and clear communication with stakeholders. Presentations often rely on KPIs, graphs, and Excel to demonstrate risks and impacts. I've learned to apply these management skills, like simplifying complex information, to my technical tasks, especially in research and penetration testing.
As many know, a crucial phase of penetration testing involves collecting data, and the volume of data can be overwhelming, especially when dealing with medium to large-sized companies that have thousands of devices. During the reconnaissance process, a widely used tool is Nmap. However, even with the current data output options (-oA), the sheer volume of information can still be challenging to analyze effectively.
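For context, a typical reconnaissance run might look like the sketch below. The target and file prefix are placeholders chosen for illustration (a real engagement would scan in-scope ranges, usually with service detection via -sV added), and the guard simply skips the scan on systems where nmap is not installed:

```shell
# -oA writes all three Nmap output formats using the given prefix:
# scan_results.nmap (normal), scan_results.gnmap (grepable), scan_results.xml (XML).
# Localhost and the small port range are used here purely for illustration.
if command -v nmap >/dev/null; then
    nmap -p 1-100 -oA scan_results 127.0.0.1
fi
```

The .nmap file from that run is the input format the script below parses.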
So, why not simplify things and leverage Excel's dynamic features? By creating filters and sorting by port, service, state, and more, we can make the data much easier to navigate and then dive deeper into our engagement.
Aiming to streamline the process while presenting the information more clearly to clients, I decided to develop the following bash script that parses data from an Nmap-style file (.nmap) and converts it into a CSV format that can be easily imported into Excel.
#!/bin/bash

# Check that both an input and an output file were provided
if [ "$#" -ne 2 ]; then
    echo "Usage: $0 <input_nmap_file> <output_csv_file>"
    exit 1
fi

# Input and output files
input_file=$1
output_file=$2

# Check if the input Nmap file exists
if [ ! -f "$input_file" ]; then
    echo "Error: Input file does not exist."
    exit 1
fi

# Create the CSV file and add the header
echo "host,port,state,service" > "$output_file"

# Parse the Nmap file
host=""
while IFS= read -r line; do
    # A "Nmap scan report for" line indicates a new host
    if [[ $line =~ ^Nmap\ scan\ report\ for\ (.*) ]]; then
        host="${BASH_REMATCH[1]}"
        echo "Found host: $host" # Debugging: print host when found
    fi

    # Check if the line contains port info (TCP/UDP ports).
    # The regex is flexible with spaces and tabs, accepts any state
    # (open, closed, filtered, open|filtered, ...), and allows service
    # names containing dots, slashes, and hyphens (e.g. ssl/http).
    if [[ $line =~ ([0-9]+)/(tcp|udp)[[:space:]]+([a-z|]+)[[:space:]]+([a-zA-Z0-9._/-]+) ]]; then
        port="${BASH_REMATCH[1]}"
        protocol="${BASH_REMATCH[2]}"
        state="${BASH_REMATCH[3]}"
        service="${BASH_REMATCH[4]}"

        # Debugging: print port, state, and service
        echo "Found port: $port/$protocol, state: $state, service: $service"

        # Append the host, port, state, and service to the CSV file
        echo "$host,$port/$protocol,$state,$service" >> "$output_file"
    fi
done < "$input_file"

echo "CSV file created at $output_file"
After setting the execute permission, you can run it like this (the script and input file names below are just examples):
chmod +x nmap_to_csv.sh
./nmap_to_csv.sh scan_results.nmap output.csv
In addition to the on-screen display, you will have generated an output.csv file which, once imported into Excel, can be represented graphically as shown below.
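Even before opening Excel, you can sanity-check the CSV in the terminal. The sketch below uses a few invented sample rows in the same format the script emits, and sorts the data rows numerically by port:

```shell
# Sample rows in the same CSV format the script produces (invented data)
cat > output_sample.csv <<'EOF'
host,port,state,service
10.0.0.1,443/tcp,open,https
10.0.0.2,80/tcp,open,http
10.0.0.1,22/tcp,open,ssh
EOF

# Print the header, then the data rows sorted numerically by port
# (-t, sets the field separator, -k2 keys on the port column, -n is numeric)
head -n 1 output_sample.csv
tail -n +2 output_sample.csv | sort -t, -k2 -n
```

Because `sort -n` only considers the leading digits of each key, the `/tcp` suffix in the port column does not interfere with the numeric ordering.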
Now it is much easier to organize the data by host, sort by port, filter by state, or do anything else you want with it. As a basic example, I created a pivot table as follows:
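For readers who prefer to stay on the command line, a similar summary can be produced before the data ever reaches Excel. This sketch (again using invented sample rows) counts open ports per host, which is the same kind of aggregation a simple pivot table gives you:

```shell
# Sample rows in the same CSV format the script produces (invented data)
cat > pivot_sample.csv <<'EOF'
host,port,state,service
10.0.0.1,22/tcp,open,ssh
10.0.0.1,80/tcp,open,http
10.0.0.2,443/tcp,open,https
10.0.0.2,25/tcp,closed,smtp
EOF

# Skip the header, count rows in the "open" state per host, and print the totals
awk -F, 'NR > 1 && $3 == "open" { count[$1]++ }
         END { for (h in count) print h "," count[h] }' pivot_sample.csv | sort
```

With the sample data above, this prints one line per host with its open-port count (10.0.0.1 has two, 10.0.0.2 has one).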
To wrap it all up, this simple yet effective tool allows us to streamline the process of analyzing Nmap data, making it far more manageable and easier to present to clients or stakeholders. By converting raw data into a structured CSV format, we can leverage Excel’s powerful features to quickly identify patterns, sort information, and generate meaningful insights. This not only simplifies our daily work, but also improves communication, enabling us to focus more on the critical aspects of our penetration testing and vulnerability assessments. Ultimately, it’s a small innovation that makes a big difference in both the efficiency and clarity of the entire process.
If you're interested in exploring ways to improve the analysis and handling of large volumes of data in penetration testing, I invite you to visit and contribute to this project on our GitHub page (https://github.com/ProfessionallyEvil/). Your insights and contributions would be greatly appreciated as we continue to refine and enhance the tools and techniques used in this field.
Jordan Bonagura
Senior Security Consultant at Secure Ideas
Jordan is a Security Consultant at Secure Ideas. If you have any further questions, feel free to reach out to him at jordan.bonagura@secureideas.com or you can find him on LinkedIn.
Read More by Jordan:
Best Practices and Risks Considerations in Automation like LCNC and RPA