
CrossLinked

CrossLinked is a LinkedIn enumeration tool that uses search engine scraping to collect valid employee names from an organization. This technique provides accurate results without API keys, credentials, or direct access to LinkedIn!

Table of Contents

• Install
• Prerequisites
• Advanced Formatting
• Search
• Parse
• Additional Options
• Command-Line Arguments
• Contribute


Install

Install the latest stable release from PyPI:

pip3 install crosslinked

Or, install the most recent code from GitHub:

git clone https://github.com/flurryunicorn/crosslinked
cd crosslinked
python3 setup.py install

Prerequisites

CrossLinked assumes the organization's account naming convention has already been identified. This is required for execution and should be supplied in the command-line arguments in the format you expect as output. See the Naming Format and Example Usage sections below:

Naming Format

{first}.{last}          = john.smith
CMP\{first}{l}          = CMP\johns
{f}{last}@company.com   = jsmith@company.com
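
Conceptually, each template just maps pieces of a scraped full name onto placeholders. The sketch below is only an illustration of that mapping, not CrossLinked's internal code:

# Illustrative sketch of the template idea -- not CrossLinked's implementation.
def format_account(full_name, nformat):
    parts = full_name.lower().split()
    fields = {
        'first': parts[0],        # first word of the scraped name
        'last': parts[-1],        # last word of the scraped name
        'f': parts[0][0],         # first initial
        'l': parts[-1][0],        # last initial
    }
    return nformat.format(**fields)

print(format_account('John Smith', '{first}.{last}'))           # john.smith
print(format_account('John Smith', 'CMP\\{first}{l}'))          # CMP\johns
print(format_account('John Smith', '{f}{last}@company.com'))    # jsmith@company.com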

🦖 Still Stuck? Metadata is always a good place to check for hidden information such as the account naming convention. See PyMeta for more.


Advanced Formatting

💥 New Feature 💥

To be compatible with alternate naming conventions, CrossLinked lets users control the index position of the name extracted from the search text. If the parsed name is not long enough, or errors are encountered with the search string, CrossLinked reverts to its default format.

Note: the search string array starts at 0. Negative numbers can also be used to count backwards from the last value.

# Default output
crosslinked -f '{first}.{last}@company.com' Company
John David Smith = john.smith@company.com

# Use the second-to-last name as "last"
crosslinked -f '{0:first}.{-2:last}@company.com' Company
John David Smith    = john.david@company.com
Jane Doe            = jane.doe@company.com

# Use the second item in the array as "last"
crosslinked -f '{first}.{1:last}@company.com' Company
John David Smith    = john.david@company.com
Jane Doe            = jane.doe@company.com
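
The index prefix simply selects a different element of the split name, falling back when the index is out of range. A rough, hypothetical sketch of that selection logic (not the tool's actual parser):

# Hypothetical sketch of index-based selection -- not CrossLinked's parser.
def pick(parts, spec, default_index):
    # spec looks like '1:last' or '-2:last'; a bare 'last' uses the default index
    idx = int(spec.split(':')[0]) if ':' in spec else default_index
    try:
        return parts[idx]
    except IndexError:
        return parts[default_index]   # revert to the default position

parts = 'John David Smith'.lower().split()
print(f"{pick(parts, '0:first', 0)}.{pick(parts, '1:last', -1)}@company.com")
# john.david@company.com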

Search

By default, CrossLinked uses the Google and Bing search engines to identify employees of the target organization. After execution, two files (names.txt & names.csv) will appear in the current directory, unless modified in the CMD args.

  • names.txt - List of unique user accounts in the specified format.
  • names.csv - Raw search data. See the Parse section below for more.
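
To illustrate the general idea, a search-engine query can be scoped to LinkedIn profile pages and employee names pulled from the result titles. The snippet below is a simplified sketch only; the endpoint, parameters, and result-title format are assumptions, not CrossLinked's actual request logic:

# Simplified illustration of search-engine scraping -- not CrossLinked's code.
# Assumes result titles look like "John Smith - Job Title - Company | LinkedIn".
import re
import requests

def linkedin_names(company, engine='https://www.bing.com/search'):
    query = f'site:linkedin.com/in "{company}"'
    resp = requests.get(engine, params={'q': query},
                        headers={'User-Agent': 'Mozilla/5.0'}, timeout=15)
    return re.findall(r">([A-Z][\w'-]+ [A-Z][\w'-]+) - [^<]+LinkedIn", resp.text)

for name in linkedin_names('Company'):
    print(name)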

Example Usage

python3 crosslinked.py -f '{first}.{last}@domain.com' company_name
python3 crosslinked.py -f 'domain\{f}{last}' -t 15 -j 2 company_name

⚠️ For best results, use the company name as it appears on LinkedIn ("Target Company"), not the domain name.


Parse

Account naming convention changed after execution, and now you're hitting CAPTCHA requests? No problem!

CrossLinked includes a names.csv output file, which stores all scraping data, including name, job title, and URL. This file can be re-ingested and parsed to reformat user accounts as needed.

Example Usage

python3 crosslinked.py -f '{f}{last}@domain.com' names.csv
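
The same reformatting can also be done by hand with the standard library if you want to post-process the CSV yourself. Column names in this sketch are assumptions based on the fields listed above:

# Sketch of re-parsing names.csv manually -- column names are assumed.
import csv

with open('names.csv', newline='') as f:
    for row in csv.DictReader(f):
        parts = row['name'].lower().split()
        if len(parts) >= 2:
            print(f'{parts[0][0]}{parts[-1]}@domain.com')   # {f}{last}@domain.com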


Additional Options

Proxy Rotation

The latest version of CrossLinked provides proxy support to rotate source addresses. Users can input a single proxy with --proxy 127.0.0.1:8080 or use multiple via --proxy-file proxies.txt.

> cat proxies.txt
127.0.0.1:8080
socks4://111.111.111.111
socks5://222.222.222.222

> python3 crosslinked.py --proxy-file proxies.txt -f '{first}.{last}@company.com' -t 10 "Company"

⚠️ HTTP/S proxies can be added in IP:Port notation. However, SOCKS proxies require a socks4:// or socks5:// prefix.
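
Conceptually, rotation just means each outbound request draws the next proxy from the list. A minimal sketch of that idea (not CrossLinked's implementation):

# Minimal proxy-rotation sketch -- illustrative only.
import itertools
import requests

with open('proxies.txt') as f:
    proxies = [line.strip() for line in f if line.strip()]

rotation = itertools.cycle(proxies)

def fetch(url):
    proxy = next(rotation)
    # requests accepts bare IP:Port for HTTP/S and socks4:// / socks5:// URLs
    # (SOCKS support requires the requests[socks] extra).
    return requests.get(url, proxies={'http': proxy, 'https': proxy}, timeout=15)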

Command-Line Arguments

positional arguments:
  company_name        Target company name

optional arguments:
  -h, --help          show help message and exit
  -t TIMEOUT          Max timeout per search (Default=15)
  -j JITTER           Jitter between requests (Default=1)

Search arguments:
  --search ENGINE     Search Engine (Default='google,bing')

Output arguments:
  -f NFORMAT          Format names, ex: 'domain\{f}{last}', '{first}.{last}@domain.com'
  -o OUTFILE          Change name of output file (omit extension)

Proxy arguments:
  --proxy PROXY       Proxy requests (IP:Port)
  --proxy-file PROXY  Load proxies from file for rotation

Contribute

Contribute to the project by:

  • Liking and sharing the tool!
  • Creating an issue to report any problems or, better yet, submitting a PR.