Sunday, May 19, 2019

How to take a Moneyball approach to business data and analytics

Businesses seeking a competitive advantage are increasingly implementing data and analytics tools, but can borrow lessons from the realm of sports to make these solutions an organizational capability, according to Ben Shields, senior lecturer at MIT Sloan School of Management.
At a basic level, analytics can offer organizations additional sources of information that can inform decisions. "The reason why that's important is that humans are naturally biased in a number of different ways," Shields said. "Having different sources of information and different perspectives at the table when making a decision can help a leader or an end user be aware of their biases and arrive at the best decision given the circumstances."
The primary example of analytics success in sports is the story told in the book and film Moneyball, about how the Oakland Athletics baseball team used an analytical approach to win games, despite a small budget.
Businesses are unequivocally more interested in working with data than ever before, Shields said.
"We are in an era where the amount of data is increasing exponentially, we have new advanced analytics tools to manage and analyze that data, and there is an increasing thirst for applying data to make better decisions," Shields said. "The challenge is, how do you actually apply data to make better decisions in organizations that are composed of people with a wide variety of different skillsets and proficiencies when it comes to understanding and applying analytics?"

Data lessons from sports

There are three main skillsets executives and their organizations need to make data-informed decisions:
1. Strategic: Developing a plan for how analytics can help the organization create and capture value
2. Technical: The ability to do analytics work in a rigorous, comprehensive way
3. Managerial: The ability to communicate data, use it in the decision-making process, and track a decision with key metrics to then iterate upon it and improve it.
More people are getting trained as data scientists, and as such, organizations are increasingly growing strong capabilities in terms of technical skillsets, Shields said. However, businesses still have room to grow in terms of the strategic and managerial skillsets, he added.
"Interestingly, these are both very human skillsets—the ability to think strategically about how you are going to use data to create value at your organization is a very human process, as is the ability to integrate data into decision-making processes throughout an organization," Shields said. "Those are two areas where I see executives constantly being challenged despite some of the progress."
On the strategy side, one useful lesson from sports in this area is the clarity and simplicity of a team's goals: To win games.
"A very clear goal like that sets up the analytics work to help the team achieve it," Shields said. "A much more focused goal can help give the analytics work more direction and meaning."
One of the major points of failure in data projects today is when businesses have a bunch of smart people working in a silo, and insights aren't shared with decision-makers across the business, Shields said.
"If executives are more clear on what goals they are trying to achieve and how they can use data and analytics to achieve those goals, then I think there's going to be a higher adoption and usage of analytics on a more consistent basis," he added.
In terms of managerial skills, Shields pointed to the example of behavioral modeling: On a baseball team, a coach can model using data to inform game strategies for players. Similarly, an executive can be transparent about how they are using data to improve decision-making, and that behavior will trickle down to the rest of the organization.
Executives must also commit to doing data-based training and reskilling in-house, to gain strong functional and institutional knowledge of data science within the organization.
"Making a commitment to training in-house to reskilling the workforce is also something that can help close the gap between analytics work that's just sitting on a spreadsheet and analytics work that actually makes an impact in decision making," Shields said.

How to communicate data findings to end users

Data needs to be presented and shared in a way that employees can actually apply to their work, Shields said. "The end users need to be able to see how data can make their jobs better—in sports, we're seeing increasingly that the athletes are more willing to embrace analytics because it's helping them perform better on the court, floor, or pitch," he added.
A key part of communicating this is answering the fundamental question of what's in it for them, Shields said. In sports, if you share a data-driven insight about how a basketball player's form is leading to misses, that player will pay attention because the information can help their performance. In a business context, if an employee is charged with client satisfaction or process improvements, and data is shared in a way that makes clear how they will benefit, they will be more likely to act on it, Shields said.
It's also important to keep information as simple as possible when sharing it, Shields said. "It might be tempting to use animations and cutting-edge data visualizations, but that could also muddy the message and confuse the value that an end user might be able to obtain," he added.

Getting started with data in your business

In terms of getting started, Shields recommends choosing one business problem an organization has, and breaking it down with the following questions:
  • What data do you need to solve that problem?
  • What people do you need to do the analytics work?
  • What methods will you use to do the analytics work?
  • What technology do you need to do the analytics work?
  • How will the insights of the analytics work be communicated?
  • How will the insights of the analytics work be used in the decision making process?
  • How will the decision be tracked, and with what metrics?
  • How will the results of the decision inform a new business problem to work on?
"There's a systematic thought process here that is not rocket science by any means, but can help even non-technical people wrap their heads around data-driven decision making," Shields said.
For more, check out TechRepublic's Cheat sheet: How to become a data scientist.

OSINT: How to find information on anyone


Your data is more exposed than you think

Open Source Intelligence (OSINT) is the gathering of information from publicly available sources and its analysis to produce actionable intelligence. The scope of OSINT is not limited to cybersecurity; it extends to corporate, business and military intelligence, and any other field where information matters.
Whether you are a recruiter, a marketing manager, a cybersecurity engineer, or just curious, you will find something useful for yourself in this article. Maybe you want to know what data of yours is out there for others to find, or just want to see if the person or organization that contacted you online is legit. In this article, I will explain how to discover a person’s digital footprint, perform digital investigations, and gather information for competitive intelligence or penetration testing.
Many OSINT tools are available nowadays, so I’m not going to cover them all, only the most popular ones and those useful in the described use cases. In this guide, I show a general approach and the different tools and methods you can use depending on your requirements and the initial data you have.

Basic OSINT steps

  1. Start with what you know (email, username, etc.)
  2. Define requirements (what you want to get)
  3. Gather the data
  4. Analyze collected data
  5. Pivot as needed using newly gathered data
  6. Validate assumptions
  7. Generate report
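As a rough illustration, the gather-analyze-pivot cycle in steps 3-5 behaves like a breadth-first worklist over identifiers. A minimal Python sketch, where the `lookup` callable is a placeholder for whatever sources you actually query:

```python
from collections import deque

def osint_pivot(seeds, lookup, max_depth=2):
    """Breadth-first pivot loop: each known identifier is looked up,
    and any new identifiers it yields are queued for another pass."""
    seen = set(seeds)
    queue = deque((s, 0) for s in seeds)
    findings = {}
    while queue:
        identifier, depth = queue.popleft()
        results = lookup(identifier)          # e.g. search engines, people search
        findings[identifier] = results
        if depth < max_depth:
            for new_id in results:
                if new_id not in seen:        # pivot only on data we haven't seen
                    seen.add(new_id)
                    queue.append((new_id, depth + 1))
    return findings
```

The `max_depth` cap matters in practice: without it, noisy sources keep feeding the queue and the investigation never converges.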

Real name

OSINT Workflow Chart: Real Name

Governmental resources

There are dozens of websites where you can find information about people or organizations, and information openness differs from country to country. I’m not going to cover them in detail, as the governmental resources I could list might not be relevant for you as a resident of a different country. Just remember that such resources exist and Google them when needed; they are not that hard to find, especially using the advanced search queries described below.

Google Dorks

In 2002, Johnny Long began collecting Google search queries that uncovered vulnerable systems or sensitive information disclosures. He labeled them Google Dorks. Since this article is about legally obtained information, I’m not going to show how to gain unauthorized access; however, you can explore the Google Hacking Database with its thousands of different queries. The queries below can return information that is difficult to locate through a simple search.
  • “john doe” — quotation marks force Google Search to do an absolutely exact match while the search is performed on Instagram.
  • “john doe” -“” — hide postings from the target’s own account, but show posted comments on the Instagram posts of others.
  • “john” “doe” — show results that exactly match given name and surname but in different combinations and exclude Instagram from results.
  • “CV” OR “Curriculum Vitae” filetype:PDF “john” “doe” — search for the target’s resumes that contain “CV” or “Curriculum Vitae” in the name and have a PDF extension.
Wrap single words in quotes only if you are 100% sure about the spelling, as by default Google will try to shape your keyword to what the masses want. By the way, what’s interesting about Instagram is that with the right Google Dork you can see comments and likes of private accounts.
Perform a search using advanced search queries on Bing, Yandex and DuckDuckGo as other search engines might give you results that Google couldn’t.

People search

There are websites that specialize in people search, which can be done by providing a real name, username, email or phone number.
People search websites allow you to opt out, but after people remove themselves from listings, new search services appear with their records in them. The reason is that the same dataset is bought and used by different services. Some companies own those datasets, and even if a person removes a listing on one of their websites, the old data is repopulated on a new domain, so a previously removed profile reappears in the search. Consequently, even if people did a pretty good job of cleaning their stuff up, you just have to wait for a new database to appear. One method of finding people who opted out is to go to a people search service, find a unique paragraph, do a quoted Google search on it, and find all of the domains that the company owns. There are chances that information your target removed from site A is now on site B.

Username

OSINT Workflow Chart: User Name
Firstly, we have to find a username. Usually, it is a name-plus-surname combination, or derived from the email or from the domain name of the website the person uses or owns. Start with the data you have and do a reverse lookup toward what you need. Obviously, the simplest way is to Google any relevant data known to you at the moment and try to find pages with the username. You can also use special websites that do a reverse username search.

Google Dorks

The same Google Dorks that I showed for the real name search will be useful when searching for a username. In addition, URL search might give you good results as usually URLs contain usernames.
  • inurl:johndoe — search for URLs on Instagram that contain “johndoe”.
  • allinurl:john doe ny — find pages with the words “john”, “doe” and “ny” in the Instagram URL. Similar to inurl but supports multiple words.
Depending on the complexity of your search and how successful previous methods were, you might want to generate a wordlist. It’s useful when you need to try a lot of options because you don’t have a clear picture of what the username should be, but have a lot of guesses. I have used a Python script to generate the wordlist below:
Name and surname were specified in Names.txt, in Terminal we just see the output
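The script itself is only shown as a screenshot, so here is a comparable sketch: a small Python generator that combines a name and surname with common separators (the separator set and combination rules are my assumptions, not the original script’s):

```python
from itertools import product

def username_wordlist(name, surname, separators=("", ".", "_", "-")):
    """Combine name, surname and their initials with common separators
    into a sorted list of username guesses."""
    name, surname = name.lower(), surname.lower()
    parts = [name, surname, name[0], surname[0]]
    guesses = set()
    for a, b in product(parts, repeat=2):
        if a == b:
            continue                      # skip degenerate pairs like john+john
        for sep in separators:
            guesses.add(a + sep + b)
    return sorted(guesses)

wordlist = username_wordlist("John", "Doe")   # e.g. contains "john.doe", "jdoe"
```

Feed the resulting list into a bulk checker rather than trying the guesses by hand.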

Username search

There are a lot of websites with a username search. Usually, one service finds accounts that another doesn’t, so it’s better to use more than one. Apart from online services, you can use WhatsMyName — a GitHub project included in more advanced tools such as Spiderfoot and Recon-ng. However, you can also use it as a standalone checker by running a Python script.
Searching for “johndoe” username on 152 sites with WhatsMyName
When searching, you might get false positives, as someone else can use the same username; be prepared for that.
Note: Running WhatsMyName, as with any locally installed tool, can be a problem when certain websites are blocked by your ISP. In that case, going through a proxy or VPN will solve the issue. Moreover, to avoid exposure you should use anonymizers anyway.

Email address

OSINT Workflow Chart: Email Address

Google Dorks

  • “” — search for all emails on a given domain.
  • HR “email” filetype:csv | filetype:xls | filetype:xlsx — find HR contact lists on a given domain.
  • filetype:xls — extract email IDs from Google on a given domain.

Email tools

  • Hunter — performs a fast scan of a domain name for email addresses and reveals their common pattern.
  • Email permutator — generates permutations at up to three domains where the target is likely to have an email address. Supports multiple variable inputs to generate custom results.
  • Proofy — allows bulk email validation, which is useful when you have generated a list of emails with a permutation tool and want to check all of them at once.
  • Verifalia — validates single email addresses for free without registration. To use bulk validation you have to sign up.

Browser plugins

  • Prophet — reveals more information about people. It uses an advanced engine to predict the most likely email combination for a given person based on name, company and other social data. Then, Prophet verifies the generated email to make sure it is correct and deliverable.
  • OSINT browser extension — contains a lot of useful links, including ones for email search and verification. Compatible with Firefox and Chrome.
  • LinkedIn Sales Navigator — plugin for Chrome that shows associated Twitter account and rich LinkedIn profile data directly in Gmail.

Compromised databases

Data breaches have become a big issue, and recently we are seeing more and more data dumps. Security researcher Troy Hunt collected the released data, stripped out the passwords, assigned emails to the breaches they were involved in, and uploaded it all to his website. While the fact of the breach itself might not be that important, knowing the email of the person you are researching, you might get a list of services that person uses or at least used.
Another option is a similar service: with a free account it works much like Troy Hunt’s website, but with an active subscription it shows passwords in clear text or as password hashes. From an OSINT perspective, we need that to search whether the same password was used on other websites — one more way to find out which services the person uses or at least used. Searching by a password or its hash shows not only the websites where it was used, but also the email addresses tied to it. Thus, we can get the target’s emails we wouldn’t obtain otherwise. It’s important to note that if the password is not unique, we might get false positives, as other people might use it as well.
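Searching breach data by hash means hashing the known password first. Since dumps commonly store unsalted MD5 or SHA-1, a small helper can produce the pivots (which hash types a given service actually indexes is an assumption you should verify):

```python
import hashlib

def password_hashes(password):
    """Common unsalted hashes of a leaked password, usable as search
    pivots in breach-data services that index hashes."""
    data = password.encode("utf-8")
    return {
        "md5": hashlib.md5(data).hexdigest(),
        "sha1": hashlib.sha1(data).hexdigest(),
        "sha256": hashlib.sha256(data).hexdigest(),
    }
```

Query each digest in turn; a hit on an uncommon password is a strong link, while a hit on a common one is exactly the false-positive case noted above.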

Phone number

OSINT Workflow Chart: Telephone #
Sometimes people link a phone number and email to their Facebook profile, so typing it into the Facebook search might show you the profile. Another option is to look up user-supplied databases of phone numbers. Such databases are not limited to America; numbers from Europe can be checked as well. Besides, for those who want something similar on a mobile device, there are several apps. There are many reverse phone lookup services, and they are usually country-specific, so find the one that fits your need.

PhoneInfoga

PhoneInfoga is one of the most advanced tools for scanning phone numbers using only free resources. The goal is to first gather basic information such as country, area, carrier and line type for any international phone number with very good accuracy, and then try to determine the VoIP provider or search for footprints in search engines to try to identify the owner.
  • Check if phone number exists and is possible
  • Gather standard information such as country, line type and carrier
  • Check several numbers at once
  • OSINT reconnaissance using external APIs, Google Hacking, phone books & search engines
  • Use custom formatting for more effective OSINT reconnaissance
Well, you can see how many resources were scanned. Definitely faster than manual search.
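The first PhoneInfoga step above, normalizing a number and extracting country information, can be sketched in a few lines (the calling-code table here is a tiny illustrative sample, not the full ITU list):

```python
# Tiny illustrative sample of country calling codes, not a complete table.
COUNTRY_CODES = {"1": "US/Canada", "33": "France", "44": "UK", "380": "Ukraine"}

def normalize(number):
    """Strip spaces, dashes and brackets; drop a leading 00 prefix."""
    digits = "".join(ch for ch in number if ch.isdigit())
    if digits.startswith("00"):
        digits = digits[2:]
    return digits

def split_country(number):
    """Return (country, national_part) by longest-prefix match on the
    calling-code table; country is None if the code is unknown."""
    n = normalize(number)
    for code in sorted(COUNTRY_CODES, key=len, reverse=True):
        if n.startswith(code):
            return COUNTRY_CODES[code], n[len(code):]
    return None, n
```

The longest-prefix match matters because calling codes have variable length ("3" could begin either "33" or "380"); dedicated tools also classify line type and carrier, which needs real numbering-plan data.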

Android Emulator

Many Android apps will work on an emulator without problems, but some might not work as expected. For example, Viber has issues activating with a VoIP phone number. However, there are many advantages to running apps on an emulator: your real accounts and phone number stay safe because you don’t have to install questionable apps on your phone, and you can easily spoof GPS coordinates.
Save the number in your phone and look at your Viber or WhatsApp contact list. These services allow users to add a photo, a biography and the owner’s name, and this information can be extracted just by knowing the telephone number.
  • Bluestacks — made primarily for gamers but runs other apps as well. Available for Windows, Mac and Linux and doesn’t require a Virtual Machine to set it up so it installs easier than Genymotion.
  • Genymotion — widely used by developers but also has a free version for personal use. Works on Windows, Mac and Linux and has a range of virtual devices to choose from. Use this guide from IntelTechniques to set up the emulator.
  • AMIDuOS — available only for Windows and leverages device drivers from the system to enable near-native performance in Android. It’s fast and has a straightforward installation. However, while the aforementioned emulators can be installed for free, AMIDuOS comes at a price of $10.

Domain name

OSINT Workflow Chart: Domain Name
If the person or an organization owns a website, you need to know how to gather information about it. Investigating it might reveal the operating system being used, software versions, personal contact info and more. I have to mention that it is advisable to investigate without ever ‘touching’ the target’s environment. This technique is called passive reconnaissance: footprinting that uses tools and resources to obtain more information about your target without directly interacting with it. Below I describe methods of obtaining information while remaining stealthy.

Google Dorks

Google Dorks is a passive information gathering method that was already mentioned above. Here I’m going to show what queries might be useful during domain investigation.
  • — limits search to a particular website or domain.
  • filetype:DOC — returns DOC files or other specified types, such as PDF, XLS and INI. Multiple file types can be searched for simultaneously by separating extensions with “|”.
  • intext:word1 — search for pages and websites that contain the specific word you are searching for.
  • allintext: word1 word2 word3 — search for all the given words in a page or website.
  • related: — lists web pages that are “similar” to a specified web page.
  • site:* — show all subdomains. The asterisk acts as a substitute for a whole word or words in search queries.

Whois

Whois provides information about the registered users or assignees of an Internet resource, such as a domain name, an IP address block, or an autonomous system. There are many Whois resources to choose from.

Reverse Whois

Reverse Whois gives you a list of domains that have the same organization name or email address in their Whois records. For example, if you are investigating a company named “John Doe Inc”, you can see all the other domains registered under “John Doe Inc”. My favourite website for this has an extensive toolkit, including reverse Whois lookups.

Same IP

Often, discovering what sites run on the same server as your target’s website uncovers valuable information. For example, you might find subdomains or development sites. Often the service provider who hosts the site is responsible for other services as well; there are online services to check this.

Passive DNS

Using only DNS records, you can see what IP a name resolved to or what name resolved to an IP. Sometimes that is not enough, and that’s where passive DNS records come in handy. They allow you to check all the names that ever resolved to the researched IP, so you can build a useful history of resolutions. My favourite product is RiskIQ Community Edition because it gives more information than just passive DNS. VirusTotal or SecurityTrails can be used for that purpose as well.

Internet archives and cache

The WaybackMachine can be used to find previous versions of webpages, enabling you to see how websites looked earlier or to recover deleted pages. Another time capsule for web pages additionally lets you manually add live URL snapshots to the archive.
There are cases when deleted pages were not archived but are still cached by search engines; you can request the cached version with a Google “cache:” query. Didn’t find anything on Google? Check the cache of other search engines, but keep in mind that a cache shows the page as of the last time it was indexed. Therefore, you might get the page with missing images and outdated information.
You may also like monitoring services that take screenshots of a webpage at selected times and send you an email alert if something changes.

Reputation, malware and referrals analysis

Reputation matters: it tells you whom you are dealing with and whether the website can be trusted. In case of any suspicion, a malicious-activity check using free online tools might save you the trouble of opening the website in a VM or taking other precautions. Referral analysis is a search for inbound and outbound HTML links. Although this test on its own won’t give you precise results, it’s one of the methods that might reveal connected domains.
  • — analyses website traffic (users, page views) and estimates how much revenue it could generate through ads.
  • — analyses website traffic and its competitors, shows what they are doing better, and gives advice on SEO improvement.
  • — an analytics tool that provides deep information on website or mobile ranking, performance, traffic sources, and more. On top of that, it does referral analysis.
  • — scans websites for known malware, blacklisting status, website errors, and out-of-date software.
  • — offers free malware scanning and provides a comprehensive report that includes malicious files, suspicious files, blacklisted status and more.
  • — helps you detect potentially malicious websites. Also, it gives more information about the domain (IP address, DNS records, etc.) and cross-references it against known blacklists.

IoT search engines

IoT (Internet of Things) search engines show you devices connected to the internet: think Google Search, but for Internet-connected devices. Why is that useful? Instead of actively scanning ports and services with Nmap, for example, you can request already-available information about open ports, applications, and protocols. The most popular internet scanner offers a public API and integration with many security tools. For marketers, it provides data about product users and where they are located. Security researchers use it to uncover vulnerable systems and get access to a wide range of IoT devices. There are other alternatives like Censys, or its Chinese analogues Fofa and ZoomEye.

Location search

OSINT Workflow Chart: Location

Geolocation tools

Creepy is a free tool that allows data gathering from social networks and image hosting services for location research. The commercial option would be Echosec that costs nearly $500 per month, but if you are a nonprofit or a freelance journalist, you can apply for access to Echosec Pro at no cost.

IP-based Geolocation

IP-based geolocation maps an IP address or MAC address to a real-world geographic location, and there are many websites that provide such a mapping. When you know the Wi-Fi access points the person has previously connected to, you can map them and do more detailed research in Google Earth.

Useful websites

  • — a database of architecture that provides images of buildings from all over the world. Might be useful to determine which building is in a picture.
  • — search for geotagged public posts on VKontakte and filter them by date.
  • — allows you to search geotagged VKontakte posts, like the previous service, but requires authorization.
  • — a global network of owned and operated live-streaming webcams, which might be useful during location research.
  • — a directory of online security cameras. The coordinates of the cameras are approximate and point to the ISP address, not the physical address of the camera.

Reverse image search

When you have a picture and want to know where else it is used or when it first appeared, do a reverse image search using Google Images, Bing Images, and Baidu Images. In addition, TinEye’s algorithms are designed differently from Google’s and as such can return different results. Why is that useful? As an investigator, you may find a person by avatar, as people usually don’t bother changing profile pictures across the various social networks they use. As a journalist, you may pair an image search with date filtering to expose fake news. For example, a picture genuinely taken on the day of an event can’t be found when the date filter range ends before that event; if the image is found anyway, it was created before the event, and therefore the story is fake. If you need a narrower search across a social network, Findclone for VKontakte or karmadecay for Reddit will do the job. It’s also worth mentioning browser extensions: RevEye for Chrome and Image Search Options for Firefox. Mobile apps like CamFind for iOS might help you search for things in the physical world. Moreover, there is an Image Identification Project that uses AI to identify what’s in an image.
The image itself contains a lot of useful information, like camera details and geocoordinates; this is called EXIF data, and if it wasn’t removed you might find a lot of interesting info. For example, map the geocoordinates to find out where the picture was taken, or get the camera serial number and look for other pictures taken with that camera on the internet; there is a special service for that. Image editing tools allow you to view metadata, and if you don’t want to install a complex program, Exiftool, a free cross-platform tool, might be the thing you are looking for. A third option is to view EXIF data online, and there are locally installed tools and online services for removing EXIF data as well.
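EXIF stores GPS positions as degree/minute/second values plus a hemisphere reference (GPSLatitudeRef of N or S, GPSLongitudeRef of E or W). Converting them to the decimal form that mapping tools expect is simple arithmetic; a minimal sketch:

```python
def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert EXIF-style GPS degrees/minutes/seconds plus a hemisphere
    reference (N/S/E/W) into a signed decimal coordinate."""
    decimal = degrees + minutes / 60 + seconds / 3600
    # South and West hemispheres are negative in decimal notation
    return -decimal if ref in ("S", "W") else decimal
```

For instance, a tag of 40° 26' 46.8" N converts to roughly 40.4463, which you can paste straight into a mapping service.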
Do you need to perform image forensics and find out whether an image was tampered with? Use the Forensically or FotoForensics online tools. If you don’t want to upload an image online, Phoenix or Ghiro can be run locally; the latter is more automated and gives you more functionality than the online tools mentioned above. Apart from that, when working with images you might need to deblur them or improve their quality, so here are some enhancement tools:
  • Smartdeblur — restores motion blur and removes Gaussian blur. Helps restore focus and make image improvements that deliver impressive results.
  • Blurity — focuses only on deblurring images; it doesn’t provide as many options as the previous tool and is available only on Mac.
  • — enhances and upscales images online using AI.

SOCMINT

SOCMINT is a subset of OSINT that concentrates on data gathering and monitoring on social media platforms. I have already described some social media intelligence techniques; here I will complete the picture by listing more useful tools.

Facebook

  • Stalkscan — displays all publicly accessible information about the person.
  • ExtractFace — extracts data from Facebook, making it available offline to use as evidence or to perform advanced offline analysis.
  • Facebook Sleep Stats — estimates sleeping patterns based on users’ online/offline status.
  • — helps you to find the Facebook ID for a profile or a group.

Twitter

  • Twitter advanced search — well, that’s pretty self-explanatory :)
  • TweetDeck — gives you a dashboard that displays separate columns of activity from your Twitter accounts. For example, you might see separate columns for your home feed, your notifications, your direct messages, and your activity — all in one place on the screen.
  • Trendsmap — shows you the most popular trends, hashtags, and keywords on Twitter from anywhere around the world.
  • Foller — gives you rich insights about any public Twitter profile (profile public information, number of tweets and followers, topics, hashtags, mentions).
  • Socialbearing — free Twitter analytics & search for tweets, timelines & twitter maps. Finds, filters, and sorts tweets or people by engagement, influence, location, sentiment and more.
  • Sleepingtime — shows the sleeping schedule of Twitter public accounts.
  • Tinfoleak — shows devices, operating systems, applications and social networks used by the Twitter user. Also, it shows places and geolocation coordinates to generate a tracking map of locations visited. Maps user tweets in Google Earth and more.


LinkedIn

  • InSpy — an enumeration tool written in Python. It can be used to search for employees of a specific organization. Additionally, it can find out what technologies the organization uses by crawling job listings for specific keywords.
  • LinkedInt — scrapes e-mail addresses of employees in a selected organization. Supports automated e-mail prefix detection for a given company domain name.
  • ScrapedIn — a Python script that scrapes profile data and imports it into XLSX file (intended to be used with Google Spreadsheets).

Automating OSINT

The internet is an ocean of data, and looking for information manually can be time-consuming and ineffective; besides, automated tools can make correlations you wouldn’t spot otherwise. Whether you need these tools depends on your case, as most of them have a steep learning curve and are meant for complex problems. If you only need to accomplish a few simple tasks, don’t bother installing software; just use the online services and standalone scripts described earlier. To save time and have an investigative environment ready, with all of the tools described below installed (except FOCA), you can download Buscador OS — a Linux Virtual Machine that is pre-configured for OSINT.

SpiderFoot

SpiderFoot is one of the best reconnaissance tools out there if you want to automate OSINT: it can query more than 100 public data sources simultaneously, and its modularity lets you fine-tune the queried sources. What I personally like is scanning by use case. There are four: get anything and everything about the target; understand what the target exposes to the Internet (done through web crawling and search engine use); query blacklists and other sources to check the target’s maliciousness; and gather intelligence through different open sources. The last one is ideal for passive reconnaissance.

theHarvester

theHarvester is a very simple yet effective tool used to fetch valuable information about a target in the information gathering stage. It is great for scanning domain-related information and harvesting emails. For passive reconnaissance, theHarvester uses many resources, such as the Bing, Baidu, Yahoo and Google search engines, as well as social networks like LinkedIn, Twitter and Google Plus. For active reconnaissance, it does DNS reverse lookup, DNS TLD expansion, and DNS brute force.

Recon-ng

Recon-ng is another great command line tool for performing information gathering thoroughly and quickly. This full-featured web reconnaissance framework includes a good selection of modules for passive reconnaissance, convenience functions, and interactive help to guide you in using it properly. For those familiar with Metasploit, Recon-ng will be easier to learn, as it has a similar usage model. If you are looking for something powerful that can quickly check your company’s visibility on the Internet, this is the tool to go with.

Maltego

Maltego is an advanced platform for analyzing complex environments. Apart from data mining, it correlates data and presents it visually. Maltego works with entities (people, companies, websites, documents and more), which you connect to gather further information about them from different sources and get meaningful results. The distinctive feature of this tool is “transforms” — a library of plugins that run different kinds of tests and data integrations with external applications.

FOCA

FOCA (Fingerprinting Organizations with Collected Archives) is a tool for extracting hidden information and metadata from documents. Once all documents are analyzed and the metadata extracted, it performs automated metadata analysis to establish which documents were created by the same user. It also correlates documents by server and printer. The latest version is available only for Windows.

Metagoofil

Metagoofil is a command line tool used to download public documents from websites and then analyze them and extract their metadata. It works with PDF, DOC, XLS, PPT, and other formats.

Conclusion

To conclude, it’s hard to stay private in the post-privacy world and to control what information is floating in this digital ocean. While you can’t control everything that’s out there about you, it’s important to at least be aware of it. It goes without saying that in the digital age information plays a key role, so those who know how to find it will always be one step ahead. That’s what this article is for: to show how OSINT helps solve a broad range of problems, from marketing to investigations and cybersecurity. However, I have described only the tip of the iceberg, and most techniques in the article are simple but still powerful. Some of these techniques, when used for malicious purposes, might cause damage, so I expect you will use them sensibly.
While this article was more about intelligence gathering, the next one will be about the preparation phase. Let me know in the comments if there is something specific you want to know about preparing an investigative environment. By the way, what tools and techniques do you use to gather intelligence?