Cyber Work
Nov 30, 2020
Securing Apple devices: Managing growing cyberattacks and risk
45 min

Dive into all things Apple security with today’s guest, Kelli Conlin, Security Solutions Specialist at Jamf. Learn about securing devices across multiple operating systems, the hidden-in-plain-sight Apple security Bible, and why Kelli’s mom isn’t allowed to use the 15-year-old Mac laptop Kelli is still hanging on to after all these years.

– Enter code “cyberwork” to get 30 days of free training with Infosec Skills: https://www.infosecinstitute.com/skills/

– View transcripts and additional episodes: https://www.infosecinstitute.com/podcast

Kelli Conlin is a Security Solutions Specialist at Jamf focused on helping organizations be more secure with Apple. Prior to joining Jamf, Kelli was an Intelligence Analyst in the U.S. Air Force supporting special operations before starting an IT career path. Kelli currently lives in Tampa, FL with her husband, son, two cats and a miserable husky.

About Infosec
Infosec believes knowledge is power when fighting cybercrime. We help IT and security professionals advance their careers with skills development and certifications while empowering all employees with security awareness and privacy training to stay cyber-safe at work and home. It’s our mission to equip all organizations and individuals with the know-how and confidence to outsmart cybercrime. Learn more at infosecinstitute.com.

Hacker Valley Studio
Hacker Valley Media
Episode 114 - The Good, Bad, and Ugly of Threat Intelligence with Patrick Coughlin
In this episode of the Hacker Valley Studio podcast, hosts Ron and Chris interview Patrick Coughlin, Co-Founder and CEO of TruSTAR. Patrick began his career as a security analyst in Washington, D.C. and the Middle East. Working with government contractors, multinational corporations, and counter-terrorism units, Patrick learned that the biggest challenge security analysts face is retrieving the information they need from disparate data sources. This discovery led Patrick to found TruSTAR, where his focus is helping organizations automate the collection and curation of threat intelligence data. Patrick's analytical prowess originated at Booz Allen Hamilton, where he learned a fundamental skill that every cybersecurity analyst should have: how to put together a slide deck. That skill helped Patrick articulate the importance of threat intelligence to leaders in government and the private sector.

As the episode progresses, Patrick details the differences between threat intelligence requirements for national security and for the enterprise. For enterprise threat intelligence programs, the goal is to accelerate automation of detection, and rarely attribution. Patrick also notes that automation is only as effective as the data feeding it, which must be cleaned, normalized, and prioritized.

What about the good, bad, and ugly of threat intelligence? Patrick describes how an organization can thrive by leveraging internal intelligence, something that gets overlooked when organizations fixate on buying threat data feeds and subscribing to ISAC feeds. Most enterprise organizations have a detection and response stack that is constantly producing information about threats relevant to their organization, which serves as great threat intelligence data.

Chris and Ron ask Patrick about the science-versus-art aspects of cybersecurity and threat intelligence. Patrick describes how there is room for both: while new concepts are being discovered, there is art in finding the needle in the haystack, but at some point intuition can be broken down into steps that a machine can repeat. For example, after years of analytical practice, an analyst can describe how and why they tag threat-intelligence-related data in a way that can be repeated by other analysts or by automation.

This episode covers an abundance of tactics and techniques for threat intelligence analysts. Patrick describes the best place to begin automating threat intelligence: detection. An analyst can ask the question, "How do I get sources of known bad indicators into my detection stack so that I can drive high-fidelity detections?" As false positives decrease, your mean time to detection (MTTD) and mean time to resolution (MTTR) decrease, which makes your threat intelligence and security operations team members more effective.

0:00 - Intro
1:53 - This episode features Patrick Coughlin, Co-Founder and CEO of TruSTAR
2:30 - Patrick's background and start as a security analyst
5:19 - How to automate threat intelligence while reducing analyst fatigue
7:05 - How Patrick cultivated his analyst prowess
8:43 - Articulating threat intelligence to government and enterprise organizations
11:09 - Can a threat intelligence program be automated?
17:21 - Patrick's experience of "good" and "bad" threat intelligence programs
20:31 - Logic vs intuition in threat intelligence
27:04 - Artificial intelligence and machine learning to make threat intelligence decisions
28:42 - Where to start when automating threat intelligence
30:02 - How to stay in touch with Patrick Coughlin

Links:
Connect with Patrick Coughlin on LinkedIn
Link to Patrick's company TruSTAR
Learn more about Hacker Valley Studio.
Support Hacker Valley Studio on Patreon.
Follow Hacker Valley Studio on Twitter.
Follow hosts Ron Eddings and Chris Cochran on Twitter.
Learn more about our sponsor ByteChek.
Take our FREE course for building threat intelligence programs by visiting www.hackervalley.com/easy
31 min
7 Minute Security
Brian Johnson
7MS #450: DIY Pentest Dropbox Tips - part 4
Hey friends! We're continuing our series on pentest dropbox building, specifically playing off last week's episode where we started talking about automating the OS builds that go on our dropboxes. Today we'll zoom in a little closer and talk about some of the specific scripting we do to get a Windows 2019 Active Directory Domain Controller installed and updated so that it's ready for you to electronically punch it in the face with some of your mad pentesting skills! Specifically, we talk about these awesome commands:

Set the time zone of your server build:
tzutil /s "Central Standard Time"

Stop your VM from falling asleep:
powercfg.exe -change -standby-timeout-ac 0

Quickly download files you need:
Invoke-WebRequest "https://somesite/somefile.file" -OutFile "c:\some\path\somefile.file"

Couple it with Expand-Archive to make auto-provisioning your toolkit even faster:
Expand-Archive "C:\some\path\some.zip" "c:\path\to\where\you\want\to\extract\the\zip"

Don't like it that Server Manager loves to rear its dumb head upon every login? Kill the scheduled task for it. Byeeeeee!!!!
Get-ScheduledTask -TaskName ServerManager | Disable-ScheduledTask -Verbose

I love Chrome more than I love IE/Edge, so I auto-install it with:
$Path = $env:TEMP; $Installer = "chrome_installer.exe"; Invoke-WebRequest "http://dl.google.com/chrome/install/375.126/chrome_installer.exe" -OutFile $Path\$Installer; Start-Process -FilePath $Path\$Installer -Args "/silent /install" -Verb RunAs -Wait; Remove-Item $Path\$Installer

Now get all the Windows updates:
Install-PackageProvider -Name NuGet -Force
Install-Module PSWindowsUpdate -Force
Import-Module PSWindowsUpdate
Get-WindowsUpdate
Install-WindowsUpdate -AcceptAll -IgnoreReboot

Then rename your machine:
Write-Host "Picking a new name for this machine...you'll need to provide your admin pw to do so"
Rename-Computer -LocalCredential administrator -PassThru
Write-Host "New name accepted!"

When you're ready to install Active Directory, grab the RSAT tools:
Write-Host "Lets install the RSAT tooleeeage!"
Add-WindowsFeature -Name RSAT-ADDS

And then the AD Domain Services role itself:
Write-Host "Now lets install the AD domain services!"
Add-WindowsFeature AD-Domain-Services

Then install the new forest:
Install-ADDSForest -DomainName your.domain -InstallDns -DomainNetbiosName yourdomain
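If you want all of that in one place, here's a rough sketch of how the snippets above might be stitched into a single provisioning script. This isn't a script from the episode; the script name, domain name, and NetBIOS name are placeholders you'd swap for your own:

# provision-dc.ps1 (hypothetical) - one-shot setup for a pentest-lab domain controller
tzutil /s "Central Standard Time"                                           # set the time zone
powercfg.exe -change -standby-timeout-ac 0                                  # keep the VM awake
Get-ScheduledTask -TaskName ServerManager | Disable-ScheduledTask -Verbose  # silence Server Manager at logon

# Patch the box (reboot manually when it finishes)
Install-PackageProvider -Name NuGet -Force
Install-Module PSWindowsUpdate -Force
Import-Module PSWindowsUpdate
Install-WindowsUpdate -AcceptAll -IgnoreReboot

# RSAT tools, the AD DS role, then stand up the new forest
Add-WindowsFeature -Name RSAT-ADDS
Add-WindowsFeature AD-Domain-Services
Install-ADDSForest -DomainName "your.domain" -InstallDns -DomainNetbiosName "YOURDOMAIN"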
56 min
Hacker Public Radio
Hacker Public Radio
HPR3252: Simple JSON querying tool (also YAML, and to a lesser extent XML)
JSON

JSON is a cool little data serialization language that allows you to easily and clearly demarcate blocks of data by nesting data structures such as lists (enclosed by square brackets) and key-value pairs or "dictionaries" (enclosed by curly braces). So that in the end you get something that looks like this:

{ "first list" : [ "element1", "element2", {"element3" : "is another k-v pair", "but contains" : ["a" , "list", "of", "words"]}] , "this value is a string" : "1" , "and this is a number" : 23 , "and floating point" : 1.413 }

Aside from the rules below, there are no restrictions on what you can do with JSON:

* Lists are enclosed in [] and each element is separated by ,
* Key-value pair lists are enclosed in {} and have the key and value separated by : and each pair is separated by ,
* Keys have to be strings quoted with double quotes
* Numbers may be left unquoted (but only in value fields)

Given how explicit the syntax is, it makes for very easy parsing, and there are plenty of good parsers out there. My favourite JSON parser is jq(1). A canonical representation of the JSON example above can easily be obtained with jq by simply calling jq '' file.json (or piping the file through stdin, or even putting the contents, properly quoted, as the second argument):

{
  "first list": [
    "element1",
    "element2",
    {
      "element3": "is another k-v pair",
      "but contains": [
        "a",
        "list",
        "of",
        "words"
      ]
    }
  ],
  "this value is a string": "1",
  "and this is a number": 23,
  "and floating point": 1.413
}

You can also use jq in a shell script to obtain, for example, the second element of the first list:

$ jq '."first list"[1]' example.json
"element2"

So to get the value associated with a key you use the notation .key, and to get the k-th element you use the notation [k-1]. To remove the quotes on a string you can use the -r flag, which stands for raw output.

jq(1) also gives you a few more functionalities that can be useful, like getting the number of elements in a list (or keys in an object) with the length function:

$ jq 'length' example.json
4
$ jq '."first list"[2]."but contains" | length' example.json
4

Another useful feature is getting the list of keys from a key-value pair list, which can be done with the keys function:

$ jq '."first list"[2] | keys[]' example.json
"but contains"
"element3"

The query language is much, much more flexible than this, but for most cases this should be enough for simple configuration querying.

YAML and XML??

The yq project allows one to use the exact same syntax as jq to query and emit (and therefore also transcode) YAML and XML, extending the usefulness of the query language. So, for example, looking at the previous file through yq gives:

$ yq -y '' example.json
first list:
- element1
- element2
- element3: is another k-v pair
  but contains:
  - a
  - list
  - of
  - words
this value is a string: '1'
and this is a number: 23
and floating point: 1.413

And the output of this can of course be queried with yq itself, or can be used to feed into whatever application requires a YAML input (I guess it lacks the triple dash at the top, but that is actually the only warning I get from passing that abomination to yamllint). Similarly, xq can be used to query XML files with the same language.
However, to emit these files from json you need to use yq -x like so:

$ yq -x '' example2.json
<file>
  <first_list>element1</first_list>
  <first_list>element2</first_list>
  <first_list>
    <element3>is another k-v pair</element3>
    <but_contains>a</but_contains>
    <but_contains>list</but_contains>
    <but_contains>of</but_contains>
    <but_contains>words</but_contains>
  </first_list>
  <this_value_is_a_string>1</this_value_is_a_string>
  <and_this_is_a_number>23</and_this_is_a_number>
  <and_floating_point>1.413</and_floating_point>
</file>

where the original (modified) file example2.json looks like:

{
  "file": {
    "first_list": [
      "element1",
      "element2",
      {
        "element3": "is another k-v pair",
        "but_contains": [
          "a",
          "list",
          "of",
          "words"
        ]
      }
    ],
    "this_value_is_a_string": "1",
    "and_this_is_a_number": 23,
    "and_floating_point": 1.413
  }
}

So that the root dictionary has a single key-value pair and all the keys have no spaces in them (so that they can be made into xml tags).
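Since xq drives the same jq filter language, the .key and [k-1] notation from earlier carries over to XML. As a rough illustration (assuming the XML above is saved as example2.xml; exact output quoting may vary by xq version):

$ xq '.file.first_list[1]' example2.xml
"element2"

Under the hood the XML is first mapped to a JSON-like structure (repeated sibling tags become a list), which is why the same queries work unchanged.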
The Social-Engineer Podcast
Social-Engineer, LLC
Ep. 138 – Security With Marcus Sailler of Capital Group
In this episode, Chris Hadnagy and Ryan MacDougall are joined by industry professional Marcus Sailler to discuss his experience as the red team information security manager at Capital Group. Marcus shares some great tips on creating a successful security team and how you can prevent it from becoming the "No Police". They also go over recent changes in the industry, including how big hacks have increased security awareness among the general public.

00:09 – Introduction to the new Security Awareness Series
01:28 – Introduction to Ryan MacDougall
  Phishing as a Service (PHaaS)
  Vishing as a Service (VaaS)
  Social-Engineer.com
02:32 – Introduction to Marcus Sailler
04:20 – How Marcus got into information security
06:08 – Recent changes in the infosec industry: how a big hack increases security awareness
12:09 – How a red team and security awareness team can collaborate to enhance security
14:25 – Introduction to Capital Group
16:17 – Coming up with relevant attacks for a global company
18:08 – How a security team can avoid becoming the "No Police"
21:39 – Why it's better to build a blue team first
22:24 – The importance of attitude and ego for a red teamer
25:04 – How a red team benefits from partnership
26:53 – Emulate the bad guy, but remember to be good
29:18 – Steps corporations should implement now
30:58 – Some of Marcus' most respected industry professionals
  Chris Hadnagy
  David McGuire
  Jason Frank
  Jeff Dimmock
  David Kennedy
  Amanda Berlin
  Ian Coldwater
  Rachel Tobac
34:47 – Marcus' book recommendations
  Sizing People Up: A Veteran FBI Agent's User Manual for Behavior Prediction
  The 5 Love Languages: The Secret to Love that Lasts
39:18 – Marcus' contact info
  LinkedIn
  Twitter
14:38 – Outro

Social-Engineer.org
Social-Engineer.com
The Innocent Lives Foundation
SEVillage: The Human Hacking Conference
Human Hacking Book Website
Human Hacking Book Amazon
Clutch
Chris on Twitter
Social-Engineer on Twitter
44 min