SentinelOne Threat-Hunting Lab

SentinelOne ran a threat-hunting course that consisted of a short presentation followed by a lab, where I was given the opportunity to play around with the SentinelOne console and go hunting in a simulated environment. While I didn't end up spending too long in the environment, I thoroughly enjoyed both the presentation and getting to use SentinelOne. I like how everything is mapped to the MITRE ATT&CK framework, and the console was fairly easy to use, which is not something you always get with EDR solutions. All in all it felt like a clean solution that offers a lot of great features, some of which I got to test out in the lab. I'm posting my notes on the presentation below for anyone who would like to see what it covered.

Think Like a Threat Hunter

  • Blue Team

    • Working knowledge of defense procedures and systems such as SIEMs, endpoint security technologies and log aggregation methods

    • Extensive network terrain knowledge

      • Can't protect what you can't see or don't know about

  • Intel

    • Understanding attack vectors that go beyond a list of IOCs

    • MITRE ATT&CK

      • Adversarial tactics, techniques, and common knowledge

      • Based on real-world observations

      • Free, open, global

      • Common language

      • Community driven

      • Shift from indicators to behaviors

    • Can apply non-technical defensive measures to a technique

      • Policy change

      • Risk acceptance
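The "shift from indicators to behaviors" point above can be sketched in a few lines of Python. An IOC search gives a point-in-time, yes/no answer against known-bad values, while a behavioral hunt flags the technique itself — here, Office applications spawning PowerShell, which maps to ATT&CK T1059.001. The event records and field names below are made up for illustration, not a real EDR schema:

```python
# Toy process-creation events (hypothetical fields, not a real EDR schema)
events = [
    {"parent": "winword.exe",  "process": "powershell.exe", "sha256": "aaa111"},
    {"parent": "explorer.exe", "process": "notepad.exe",    "sha256": "bbb222"},
    {"parent": "excel.exe",    "process": "powershell.exe", "sha256": "ccc333"},
]

# IOC-based search: a yes/no check against a list of known-bad hashes.
# If the attacker's tooling isn't on the list yet, nothing is found.
known_bad_hashes = {"ddd444"}
ioc_hits = [e for e in events if e["sha256"] in known_bad_hashes]

# Behavior-based hunt: flags the pattern (Office app spawning PowerShell,
# ATT&CK T1059.001) regardless of whether the file hash has been seen before.
office_apps = {"winword.exe", "excel.exe", "outlook.exe"}
behavior_hits = [
    e for e in events
    if e["parent"] in office_apps and e["process"] == "powershell.exe"
]

print(len(ioc_hits))       # the stale IOC list misses everything
print(len(behavior_hits))  # the behavioral rule catches both PowerShell spawns
```

The same stale intelligence that makes the IOC search come up empty has no effect on the behavioral rule, which is the point the presentation was making.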

  • Red Team

    • Threat hunters should have experience thinking and acting like an attacker

    • A red-teamer's mindset is geared toward moving through an organization undetected

    • Understand how to utilize exploits, how to elevate privileges, and how to write scripts


  • Paranoia

    • Find it

    • Contain it

    • Remediate it

  • Digital Forensics and Incident Response

    • DFIR knowledge equips a hunter with the foresight of knowing where to look for suspicious or anomalous behaviors and artifacts

    • Tools

      • Autopsy

      • Yara

      • Rekall

      • Wireshark

      • Elasticsearch


  • Searching != Hunting

    • Searching is more tactical

      • Point-in-time value

      • More based on IOCs

      • Yes or no answers

      • Third-party intelligence

      • Quick answers to immediate concerns

    • Hunting is more strategic

      • Continuing value

      • Unique network knowledge

      • Scientifically postulated and tested

      • Iterative process

  • Creating a hypothesis

    • What data is accessible/indexed/searchable

    • What information is within the data

    • What external information can I inject into a search

    • Where should the search focus?

  • How are you accounting for normal?

  • Does the query make it hard to filter out normal?

  • How useful are the results?
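The last three questions above — accounting for normal, filtering it out, and judging the usefulness of the results — can be illustrated with a minimal Python sketch. The login events, users, and baseline here are all invented for the example; the idea is that unique knowledge of your own network becomes a baseline, the baseline is filtered out of the results, and whatever the analyst triages feeds back into the next iteration of the hunt:

```python
# Hypothetical login events to hunt over (invented data for illustration)
logins = [
    {"user": "svc_backup", "src": "10.0.0.5",    "hour": 3},
    {"user": "alice",      "src": "10.0.0.12",   "hour": 9},
    {"user": "alice",      "src": "203.0.113.7", "hour": 2},
]

# Baseline of expected behavior, built from unique knowledge of the network:
# which source addresses each account normally logs in from.
baseline = {
    "svc_backup": {"10.0.0.5"},   # backup service logs in overnight, every night
    "alice":      {"10.0.0.12"},  # alice works from the office subnet
}

def is_normal(event):
    """An event is 'normal' if its source address matches the user's baseline."""
    return event["src"] in baseline.get(event["user"], set())

# Filter normal out so the analyst only reviews what's left; confirmed-benign
# results get folded into the baseline on the next iteration of the hunt.
suspicious = [e for e in logins if not is_normal(e)]
print(suspicious)  # only alice's off-hours login from an external address remains
```

If the query can't express `is_normal` — if there's no way to separate the backup service's expected 3 a.m. logins from a genuinely odd one — the results drown the analyst in noise, which is exactly what the "does the query make it hard to filter out normal?" question is probing.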

That's it for my notes from the lab. One last piece I have is a graphic from the PowerPoint that I thought was a good visual, so I grabbed a screenshot of it, as you can see below.
