cloud-enum / ac681ab
Merge new upstream release 0.6. Kali Janitor, 3 years ago.
10 changed file(s) with 393 addition(s) and 92 deletion(s).

--- a/README.md
+++ b/README.md
@@ ... @@
  Currently enumerates the following:

  **Amazon Web Services**:
- - Open S3 Buckets
- - Protected S3 Buckets
+ - Open / Protected S3 Buckets
+ - awsapps (WorkMail, WorkDocs, Connect, etc.)

  **Microsoft Azure**:
  - Storage Accounts
@@ ... @@
  - Web Apps

  **Google Cloud Platform**
- - Open GCP Buckets
- - Protected GCP Buckets
+ - Open / Protected GCP Buckets
+ - Open / Protected Firebase Realtime Databases
  - Google App Engine sites
+ - Cloud Functions (enumerates project/regions with existing functions, then brute-forces actual function names)

  By "open" buckets/containers, I mean those that allow anonymous users to list contents. If you discover a protected bucket/container, it is still worth trying to brute-force the contents with another tool.
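
As a quick illustration (hypothetical bucket name, not output from the tool), anonymous listing is the difference you would observe:

```python
import requests

# Hypothetical bucket name, used only to illustrate "open" vs "protected"
reply = requests.get('https://somecompany-backup.s3.amazonaws.com/')

# An open bucket answers an anonymous GET with 200 and an XML key listing;
# a protected bucket answers with 403 (AccessDenied).
print(reply.status_code)
```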

- **IMPORTANT**: Azure Virtual Machine DNS records can span a lot of geo regions. To save time scanning, there is a "REGIONS" variable defined in cloudenum/azure_regions.py. You'll want to look at this file and edit it to be relevant to your own work.
+ See it in action in [Codingo](https://github.com/codingo)'s video demo [here](https://www.youtube.com/embed/pTUDJhWJ1m0).

  <img src="https://initstring.keybase.pub/host/images/cloud_enum.png" align="center"/>

  # Usage

  ## Setup
- You'll need the `requests-futures` python package, as this tool uses it for multi-threaded HTTP requests. It's a very cool package if you're already using `requests`; I highly recommend it.
+ Several non-standard libraries are required to support threaded HTTP requests and DNS lookups. You'll need to install the requirements as follows:

  ```sh
  pip3 install -r ./requirements.txt
@@ ... @@

  You can provide multiple keywords by specifying the `-k` argument multiple times.

- Azure Containers require two levels of brute-forcing, both handled automatically by this tool. First, by finding valid accounts (DNS). Then, by brute-forcing container names inside that account (HTTP scraping). The tool uses the same fuzzing file for both by default, but you can specify individual files separately if you'd like.
+ Keywords are mutated automatically using strings from `enum_tools/fuzz.txt` or a file you provide with the `-m` flag. Services that require a second level of brute-forcing (Azure Containers and GCP Functions) will also use `fuzz.txt` by default, or a file you provide with the `-b` flag.

  Let's say you were researching "somecompany", whose website is "somecompany.io", and which makes a product called "blockchaindoohickey". You could run the tool like this:

  ```sh
  cloudenum.py -k somecompany -k somecompany.io -k blockchaindoohickey
  ```
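
To make the mutation idea concrete, here is a minimal sketch of what happens to your keywords (an assumed scheme for illustration; the real logic lives in `build_names()` in cloud_enum.py):

```python
# Illustrative sketch only -- an assumed mutation scheme, not the exact
# implementation of build_names() in cloud_enum.py.
def build_names(keywords, mutations):
    names = []
    for keyword in keywords:
        names.append(keyword)  # the bare keyword is always a candidate
        for mutation in mutations:
            # Join keyword and mutation with common separators, both ways
            for sep in ('', '.', '-'):
                names.append('{}{}{}'.format(keyword, sep, mutation))
                names.append('{}{}{}'.format(mutation, sep, keyword))
    return names

# build_names(['somecompany'], ['dev']) would yield candidates like:
# somecompany, somecompanydev, devsomecompany, somecompany.dev,
# dev.somecompany, somecompany-dev, dev-somecompany
```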

- DNS brute-forcing uses a hard-coded 25 threads, leveraging subprocess and the Linux `host` command.
-
- HTTP scraping uses 5 threads by default. You can try increasing this, but eventually the cloud providers will rate limit you. Here is an example to increase to 10.
+ HTTP scraping and DNS lookups use 5 threads each by default. You can try increasing this, but eventually the cloud providers will rate-limit you. Here is an example increasing the count to 10:

  ```sh
  cloudenum.py -k keyword -t 10
  ```

+ **IMPORTANT**: Some resources (Azure Containers, GCP Functions) are discovered per-region. To save time scanning, there is a "REGIONS" variable defined in `cloudenum/azure_regions.py` and `cloudenum/gcp_regions.py` that is set by default to use only 1 region. You may want to look at these files and edit them to be relevant to your own work.
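
For example, `gcp_regions.py` (added in full later in this diff) ships with the scan narrowed to a single region; since the last `REGIONS` assignment wins, widening the scan is just a matter of overwriting it:

```python
# Default as shipped: only one region is scanned
REGIONS = ['us-central1',]

# Example edit: overwrite with whichever regions matter to your work
REGIONS = ['us-central1', 'us-east1', 'europe-west1', 'asia-east2']
```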

  **Complete Usage Details**
  ```
@@ ... @@
    -kf KEYFILE, --keyfile KEYFILE
                          Input file with a single keyword per line.
    -m MUTATIONS, --mutations MUTATIONS
-                         Mutations. Default: cloud_enum/mutations.txt.
+                         Mutations. Default: enum_tools/fuzz.txt
    -b BRUTE, --brute BRUTE
                          List to brute-force Azure container names. Default:
-                         cloud_enum/brute.txt.
+                         enum_tools/fuzz.txt
    -t THREADS, --threads THREADS
                          Threads for HTTP brute-force. Default = 5
    -ns NAMESERVER, --nameserver NAMESERVER
@@ ... @@
    --disable-aws         Disable Amazon checks.
    --disable-azure       Disable Azure checks.
    --disable-gcp         Disable Google checks.
+   -qs, --quickscan      Disable all mutations and second-level scans
+
  ```

  # Thanks

--- a/cloud_enum.py
+++ b/cloud_enum.py
@@ ... @@
      parser.add_argument('--disable-gcp', action='store_true',
                          help='Disable Google checks.')

+     parser.add_argument('-qs', '--quickscan', action='store_true',
+                         help='Disable all mutations and second-level scans')

      args = parser.parse_args()
@@ ... @@
      Print a short pre-run status message
      """
      print("Keywords: {}".format(', '.join(args.keyword)))
-     print("Mutations: {}".format(args.mutations))
+     if args.quickscan:
+         print("Mutations: NONE! (Using quickscan)")
+     else:
+         print("Mutations: {}".format(args.mutations))
      print("Brute-list: {}".format(args.brute))
      print("")

+ def check_windows():
+     """
+     Fixes pretty color printing for Windows users. Keeping out of
+     requirements.txt to avoid the library requirement for most users.
+     """
+     if os.name == 'nt':
+         try:
+             import colorama
+             colorama.init()
+         except ModuleNotFoundError:
+             print("[!] Yo, Windows user - if you want pretty colors, you can"
+                   " install the colorama python package.")

  def read_mutations(mutations_file):
      """
@@ ... @@
      # Generate a basic status on targets and parameters
      print_status(args)

-     # First, build a sort base list of target names
-     mutations = read_mutations(args.mutations)
+     # Give our Windows friends a chance at pretty colors
+     check_windows()
+
+     # First, build a sorted base list of target names
+     if args.quickscan:
+         mutations = []
+     else:
+         mutations = read_mutations(args.mutations)
      names = build_names(args.keyword, mutations)

      # All the work is done in the individual modules
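
The dispatch code itself sits outside this hunk; based on the `run_all(names, args)` signature each module exposes below and the `--disable-*` flags in the usage text, it presumably looks something like this sketch:

```python
# Assumed dispatch logic (not part of this diff) -- each provider module
# exposes run_all(names, args), and the CLI offers --disable-* flags.
from enum_tools import aws_checks, azure_checks, gcp_checks

if not args.disable_aws:
    aws_checks.run_all(names, args)
if not args.disable_azure:
    azure_checks.run_all(names, args)
if not args.disable_gcp:
    gcp_checks.run_all(names, args)
```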

--- a/debian/changelog
+++ b/debian/changelog
+ cloud-enum (0.6-0kali1) UNRELEASED; urgency=low
+  -- Kali Janitor <[email protected]>  Wed, 07 Apr 2021 02:15:07 -0000
+
  cloud-enum (0.2-0kali1) kali-dev; urgency=medium

  [ Joseph O'Gorman ]

--- a/enum_tools/aws_checks.py
+++ b/enum_tools/aws_checks.py
@@ ... @@

  # Known S3 domain names
  S3_URL = 's3.amazonaws.com'
+ APPS_URL = 'awsapps.com'

  # Known AWS region names. This global will be used unless the user passes
  # in a specific region name. (NOT YET IMPLEMENTED)
@@ ... @@
      # Stop the timer
      utils.stop_timer(start_time)

+ def check_awsapps(names, threads, nameserver):
+     """
+     Checks for existence of AWS Apps
+     (i.e. WorkDocs, WorkMail, Connect, etc.)
+     """
+     print("[+] Checking for AWS Apps")
+
+     # Start a counter to report on elapsed time
+     start_time = utils.start_timer()
+
+     # Initialize the list of domain names to look up
+     candidates = []
+
+     # Initialize the list of valid hostnames
+     valid_names = []
+
+     # Take each mutated keyword and craft a domain name to look up
+     for name in names:
+         candidates.append('{}.{}'.format(name, APPS_URL))
+
+     # AWS Apps use DNS sub-domains. First, see which are valid.
+     valid_names = utils.fast_dns_lookup(candidates, nameserver,
+                                         threads=threads)
+
+     for name in valid_names:
+         utils.printc("    App Found: https://{}\n".format(name), 'orange')
+
+     # Stop the timer
+     utils.stop_timer(start_time)

  def run_all(names, args):
      """
      Function is called by main program
@@ ... @@
      #if not regions:
      #    regions = AWS_REGIONS
      check_s3_buckets(names, args.threads)
-     return ''
+     check_awsapps(names, args.threads, args.nameserver)

--- a/enum_tools/azure_checks.py
+++ b/enum_tools/azure_checks.py
@@ ... @@
          candidates.append('{}.{}'.format(name, BLOB_URL))

      # Azure Storage Accounts use DNS sub-domains. First, see which are valid.
-     valid_names = utils.fast_dns_lookup(candidates, nameserver)
+     valid_names = utils.fast_dns_lookup(candidates, nameserver,
+                                         threads=threads)

      # Send the valid names to the batch HTTP processor
      utils.get_url_batch(valid_names, use_ssl=False,
@@ ... @@

      # Stop brute forcing accounts without permission
      if ('not authorized to perform this operation' in reply.reason or
-             'not have sufficient permissions' in reply.reason):
-         print("    [!] Breaking out early, auth errors.")
+             'not have sufficient permissions' in reply.reason or
+             'Public access is not permitted' in reply.reason or
+             'Server failed to authenticate the request' in reply.reason):
+         print("    [!] Breaking out early, auth required.")
          return 'breakout'

      # Stop brute forcing unsupported accounts
@@ ... @@
          valid_accounts.append(account)

      # Read the brute force file into memory
-     with open(brute_list, encoding="utf8", errors="ignore") as infile:
-         names = infile.read().splitlines()
-
-     # Clean up the names to be usable for containers
-     banned_chars = re.compile('[^a-z0-9-]')
-     clean_names = []
-     for name in names:
-         name = name.lower()
-         name = banned_chars.sub('', name)
-         if 63 >= len(name) >= 3:
-             if name not in clean_names:
-                 clean_names.append(name)
+     clean_names = utils.get_brute(brute_list, mini=3)

      # Start a counter to report on elapsed time
      start_time = utils.start_timer()
@@ ... @@
      utils.printc("    Registered Azure Website DNS Name: {}\n"
                   .format(hostname), 'green')

- def check_azure_websites(names, nameserver):
+ def check_azure_websites(names, nameserver, threads):
      """
      Checks for Azure Websites (PaaS)
      """
@@ ... @@

      # Azure Websites use DNS sub-domains. If it resolves, it is registered.
      utils.fast_dns_lookup(candidates, nameserver,
-                           callback=print_website_response)
+                           callback=print_website_response,
+                           threads=threads)

      # Stop the timer
      utils.stop_timer(start_time)
@@ ... @@
      utils.printc("    Registered Azure Database DNS Name: {}\n"
                   .format(hostname), 'green')

- def check_azure_databases(names, nameserver):
+ def check_azure_databases(names, nameserver, threads):
      """
      Checks for Azure Databases
      """
@@ ... @@

      # Azure databases use DNS sub-domains. If it resolves, it is registered.
      utils.fast_dns_lookup(candidates, nameserver,
-                           callback=print_database_response)
+                           callback=print_database_response,
+                           threads=threads)

      # Stop the timer
      utils.stop_timer(start_time)
@@ ... @@
      utils.printc("    Registered Azure Virtual Machine DNS Name: {}\n"
                   .format(hostname), 'green')

- def check_azure_vms(names, nameserver):
+ def check_azure_vms(names, nameserver, threads):
      """
      Checks for Azure Virtual Machines
      """
@@ ... @@

      # Azure VMs use DNS sub-domains. If it resolves, it is registered.
      utils.fast_dns_lookup(candidates, nameserver,
-                           callback=print_vm_response)
+                           callback=print_vm_response,
+                           threads=threads)

      # Stop the timer
      utils.stop_timer(start_time)
@@ ... @@

      valid_accounts = check_storage_accounts(names, args.threads,
                                              args.nameserver)
-     if valid_accounts:
+     if valid_accounts and not args.quickscan:
          brute_force_containers(valid_accounts, args.brute, args.threads)

-     check_azure_websites(names, args.nameserver)
-     check_azure_databases(names, args.nameserver)
-     check_azure_vms(names, args.nameserver)
+     check_azure_websites(names, args.nameserver, args.threads)
+     check_azure_databases(names, args.nameserver, args.threads)
+     check_azure_vms(names, args.nameserver, args.threads)

--- a/enum_tools/fuzz.txt
+++ b/enum_tools/fuzz.txt
@@ ... @@
  2017
  2018
  2019
+ 2020
  3
  4
  5
@@ ... @@
  amazon
  analytics
  android
+ api
  app
  appengine
  appspot
@@ ... @@
  contact
  container
  content
+ core
  corp
  corporate
  data
@@ ... @@
  files
  fileshare
  filestore
+ firebase
  firestore
  functions
+ gateway
  gcp
  gcp-logs
  gcplogs
@@ ... @@
  graphite
  graphql
  gs
+ gw
  help
+ iaas
  hub
  iam
  images
@@ ... @@
  iot
  jira
  js
+ k8s
  kube
  kubeengine
  kubernetes
@@ ... @@
  oracle
  org
  packages
+ paas
  passwords
  photos
  pics
@@ ... @@
  repo
  reports
  resources
+ rtdb
  s3
+ saas
  screenshots
  scripts
  sec
@@ ... @@
  subversion
  support
  svn
+ svc
  syslog
  tasks
  teamcity
@@ ... @@
  users
  ux
  videos
+ vm
  web
  website
  wp

--- a/enum_tools/gcp_checks.py
+++ b/enum_tools/gcp_checks.py
@@ ... @@
  """

  from enum_tools import utils
+ from enum_tools import gcp_regions

  BANNER = '''
  ++++++++++++++++++++++++++
  ++++++++++++++++++++++++++
  '''

- # Known S3 domain names
+ # Known GCP domain names
  GCP_URL = 'storage.googleapis.com'
+ FBRTDB_URL = 'firebaseio.com'
  APPSPOT_URL = 'appspot.com'
+ FUNC_URL = 'cloudfunctions.net'
+
+ # Hacky, I know. Used to store project/region combos that report at least
+ # one cloud function, to brute force later on
+ HAS_FUNCS = []

  def print_bucket_response(reply):
      """
@@ ... @@
      # Stop the timer
      utils.stop_timer(start_time)

+ def print_fbrtdb_response(reply):
+     """
+     Parses the HTTP reply of a brute-force attempt
+
+     This function is passed into the class object so we can view results
+     in real-time.
+     """
+     if reply.status_code == 404:
+         pass
+     elif reply.status_code == 200:
+         utils.printc("    OPEN GOOGLE FIREBASE RTDB: {}\n"
+                      .format(reply.url), 'green')
+     elif reply.status_code == 401:
+         utils.printc("    Protected Google Firebase RTDB: {}\n"
+                      .format(reply.url), 'orange')
+     elif reply.status_code == 402:
+         utils.printc("    Payment required on Google Firebase RTDB: {}\n"
+                      .format(reply.url), 'orange')
+     else:
+         print("    Unknown status codes being received from {}:\n"
+               "       {}: {}"
+               .format(reply.url, reply.status_code, reply.reason))
+
+ def check_fbrtdb(names, threads):
+     """
+     Checks for Google Firebase RTDB
+     """
+     print("[+] Checking for Google Firebase Realtime Databases")
+
+     # Start a counter to report on elapsed time
+     start_time = utils.start_timer()
+
+     # Initialize the list of correctly formatted urls
+     candidates = []
+
+     # Take each mutated keyword and craft a URL with the correct format
+     for name in names:
+         # Firebase RTDB names cannot include a period. We'll exclude
+         # those from the global candidates list
+         if '.' not in name:
+             candidates.append('{}.{}/.json'.format(name, FBRTDB_URL))
+
+     # Send the valid names to the batch HTTP processor
+     utils.get_url_batch(candidates, use_ssl=True,
+                         callback=print_fbrtdb_response,
+                         threads=threads,
+                         redir=False)
+
+     # Stop the timer
+     utils.stop_timer(start_time)

  def print_appspot_response(reply):
      """
      Parses the HTTP reply of a brute-force attempt
@@ ... @@
      """
      if reply.status_code == 404:
          pass
-     elif reply.status_code == 500 or reply.status_code == 503:
+     elif str(reply.status_code)[0] == '5':
          utils.printc("    Google App Engine app with a 50x error: {}\n"
                       .format(reply.url), 'orange')
-     elif reply.status_code == 200 or reply.status_code == 302:
+     elif (reply.status_code == 200
+             or reply.status_code == 302
+             or reply.status_code == 404):
          utils.printc("    Google App Engine app: {}\n"
                       .format(reply.url), 'green')
      else:
@@ ... @@
      # Stop the timer
      utils.stop_timer(start_time)

+ def print_functions_response1(reply):
+     """
+     Parses the HTTP reply from the initial Cloud Functions check
+
+     This function is passed into the class object so we can view results
+     in real-time.
+     """
+     if reply.status_code == 404:
+         pass
+     elif reply.status_code == 302:
+         utils.printc("    Contains at least 1 Cloud Function: {}\n"
+                      .format(reply.url), 'green')
+         HAS_FUNCS.append(reply.url)
+     else:
+         print("    Unknown status codes being received from {}:\n"
+               "       {}: {}"
+               .format(reply.url, reply.status_code, reply.reason))
+
+ def print_functions_response2(reply):
+     """
+     Parses the HTTP reply from the secondary, brute-force Cloud Functions
+     check
+
+     This function is passed into the class object so we can view results
+     in real-time.
+     """
+     if 'accounts.google.com/ServiceLogin' in reply.url:
+         pass
+     elif reply.status_code == 403 or reply.status_code == 401:
+         utils.printc("    Auth required Cloud Function: {}\n"
+                      .format(reply.url), 'orange')
+     elif reply.status_code == 405:
+         utils.printc("    UNAUTHENTICATED Cloud Function (POST-Only): {}\n"
+                      .format(reply.url), 'green')
+     elif reply.status_code == 200 or reply.status_code == 404:
+         utils.printc("    UNAUTHENTICATED Cloud Function (GET-OK): {}\n"
+                      .format(reply.url), 'green')
+     else:
+         print("    Unknown status codes being received from {}:\n"
+               "       {}: {}"
+               .format(reply.url, reply.status_code, reply.reason))
+
+ def check_functions(names, brute_list, quickscan, threads):
+     """
+     Checks for Google Cloud Functions running on cloudfunctions.net
+
+     This is a two-part process. First, we want to find region/project combos
+     that have existing Cloud Functions. The URL for a function looks like
+     this: https://[ZONE]-[PROJECT-ID].cloudfunctions.net/[FUNCTION-NAME]
+
+     We look for a 302 in [ZONE]-[PROJECT-ID].cloudfunctions.net. That means
+     there are some functions defined in that region. Then, we brute force a
+     list of possible function names there.
+
+     See gcp_regions.py to define which regions to check. The tool currently
+     defaults to only 1 region, so you should really modify it for best
+     results.
+     """
+     print("[+] Checking for project/zones with Google Cloud Functions.")
+
+     # Start a counter to report on elapsed time
+     start_time = utils.start_timer()
+
+     # Pull the regions from a config file
+     regions = gcp_regions.REGIONS
+
+     print("[*] Testing across {} regions defined in the config file"
+           .format(len(regions)))
+
+     for region in regions:
+         # Initialize the list of initial URLs to check
+         candidates = [region + '-' + name + '.' + FUNC_URL for name in names]
+
+         # Send the valid names to the batch HTTP processor
+         utils.get_url_batch(candidates, use_ssl=False,
+                             callback=print_functions_response1,
+                             threads=threads,
+                             redir=False)
+
+     # Return from the function if we have not found any valid combos
+     if not HAS_FUNCS:
+         utils.stop_timer(start_time)
+         return
+
+     # Also bail out if doing a quick scan
+     if quickscan:
+         return
+
+     # If we did find something, we'll use the brute list. This allows people
+     # to provide a separate fuzzing list if they choose.
+     print("[*] Brute-forcing function names in {} project/region combos"
+           .format(len(HAS_FUNCS)))
+
+     # Load the brute list into memory, based on allowed chars/etc.
+     brute_strings = utils.get_brute(brute_list)
+
+     # The global was built in a previous function. We only want to brute
+     # force project/region combos that we know have existing functions
+     for func in HAS_FUNCS:
+         print("[*] Brute-forcing {} function names in {}"
+               .format(len(brute_strings), func))
+         # Initialize the list of initial URLs to check. Strip out the HTTP
+         # protocol first, as that is handled in the utility
+         func = func.replace("http://", "")
+
+         # Noticed weird behaviour with functions when a slash is not
+         # appended. Works for some, but not others. However, appending a
+         # slash seems to get consistent results. Might need further
+         # validation.
+         candidates = [func + brute + '/' for brute in brute_strings]
+
+         # Send the valid names to the batch HTTP processor
+         utils.get_url_batch(candidates, use_ssl=False,
+                             callback=print_functions_response2,
+                             threads=threads)
+
+     # Stop the timer
+     utils.stop_timer(start_time)

  def run_all(names, args):
      """
      Function is called by main program
@@ ... @@
      print(BANNER)

      check_gcp_buckets(names, args.threads)
+     check_fbrtdb(names, args.threads)
      check_appspot(names, args.threads)
-     return ''
+     check_functions(names, args.brute, args.quickscan, args.threads)

--- /dev/null
+++ b/enum_tools/gcp_regions.py
+ """
+ File used to track the DNS regions for GCP resources.
+ """
+
+ # Some enumeration tasks will need to go through the complete list of
+ # possible DNS names for each region. You may want to modify this file to
+ # use the regions meaningful to you.
+ #
+ # Whatever is listed in the last instance of 'REGIONS' below is what the
+ # tool will use.
+
+
+ # Here is the list I get when running `gcloud functions regions list`
+ REGIONS = ['us-central1', 'us-east1', 'us-east4', 'us-west2', 'us-west3',
+            'us-west4', 'europe-west1', 'europe-west2', 'europe-west3',
+            'europe-west6', 'asia-east2', 'asia-northeast1', 'asia-northeast2',
+            'asia-northeast3', 'asia-south1', 'asia-southeast2',
+            'northamerica-northeast1', 'southamerica-east1',
+            'australia-southeast1']
+
+
+ # And here I am limiting the search by overwriting this variable:
+ REGIONS = ['us-central1',]

--- a/enum_tools/utils.py
+++ b/enum_tools/utils.py
@@ ... @@

  import time
  import sys
- import subprocess
  import datetime
  import re
- import requests
+ from multiprocessing.dummy import Pool as ThreadPool
+ from functools import partial
  try:
+     import requests
+     import dns
+     import dns.resolver
      from concurrent.futures import ThreadPoolExecutor
      from requests_futures.sessions import FuturesSession
      from concurrent.futures._base import TimeoutError
  except ImportError:
-     print("[!] You'll need to pip install requests_futures for this tool.")
+     print("[!] Please pip install requirements.txt.")
      sys.exit()

  LOGFILE = False
@@ ... @@
          log_writer.write("\n\n#### CLOUD_ENUM {} ####\n"
                           .format(now))

- def get_url_batch(url_list, use_ssl=False, callback='', threads=5):
+ def get_url_batch(url_list, use_ssl=False, callback='', threads=5, redir=True):
      """
      Processes a list of URLs, sending the results back to the calling
      function in real-time via the `callback` parameter
@@ ... @@
      else:
          proto = 'http://'

-     # Start a requests object
-     session = FuturesSession(executor=ThreadPoolExecutor(max_workers=threads))
-
      # Using the async requests-futures module, work in batches based on
      # the 'queue' list created above. Call each URL, sending the results
      # back to the callback function.
      for batch in queue:
+         # I used to initialize the session object outside of this loop, BUT
+         # there were a lot of errors that looked related to pool cleanup not
+         # happening. Putting it in here fixes the issue.
+         # There is an unresolved discussion here:
+         # https://github.com/ross/requests-futures/issues/20
+         session = FuturesSession(executor=ThreadPoolExecutor(max_workers=threads+5))
          batch_pending = {}
          batch_results = {}

          # First, grab the pending async request and store it in a dict
          for url in batch:
-             batch_pending[url] = session.get(proto + url)
+             batch_pending[url] = session.get(proto + url, allow_redirects=redir)

          # Then, grab all the results from the queue.
          # This is where we need to catch exceptions that occur with large
@@ ... @@
                  # Timeout is set due to observation of some large jobs simply
                  # hanging forever with no exception raised.
                  batch_results[url] = batch_pending[url].result(timeout=30)
-             except requests.exceptions.ConnectionError:
-                 print("    [!] Connection error on {}. Investigate if there"
-                       " are many of these.".format(url))
+             except requests.exceptions.ConnectionError as error_msg:
+                 print("    [!] Connection error on {}:".format(url))
+                 print(error_msg)
              except TimeoutError:
                  print("    [!] Timeout on {}. Investigate if there are"
                        " many of these".format(url))
@@ ... @@
      # Clear the status message
      sys.stdout.write('                            \r')

- def fast_dns_lookup(names, nameserver, callback='', threads=25):
-     """
-     Helper function to resolve DNS names. Uses subprocess for threading.
+ def dns_lookup(nameserver, name):
+     """
+     This function performs the actual DNS lookup when called in a threadpool
+     by the fast_dns_lookup function.
+     """
+     res = dns.resolver.Resolver()
+     res.timeout = 10
+     res.nameservers = [nameserver]
+
+     try:
+         res.query(name)
+         # If no exception is thrown, return the valid name
+         return name
+     except dns.resolver.NXDOMAIN:
+         return ''
+     except dns.exception.Timeout:
+         print("    [!] DNS Timeout on {}. Investigate if there are many"
+               " of these.".format(name))
+
+ def fast_dns_lookup(names, nameserver, callback='', threads=5):
+     """
+     Helper function to resolve DNS names. Uses multithreading.
      """
      total = len(names)
      current = 0
@@ ... @@
      # Break the url list into smaller lists based on thread size
      queue = [names[x:x+threads] for x in range(0, len(names), threads)]

-     # Work through the smaller lists in batches. Using Python's subprocess
-     # module, the host OS will execute the `host` command. Python will
-     # move on to the next and then check the output of the OS command when
-     # finished queueing the batch. A status code of 0 means the host lookup
-     # succeeded.
      for batch in queue:
-         batch_pending = {}
-         batch_results = {}
-
-         # First, grab the pending async request and store it in a dict
-         for name in batch:
-             # Build the OS command to look up a DNS name
-             cmd = ['host', '{}'.format(name), '{}'.format(nameserver)]
-
-             # Run the command and store the pending output
-             batch_pending[name] = subprocess.Popen(cmd,
-                                                    stdout=subprocess.DEVNULL,
-                                                    stderr=subprocess.DEVNULL)
-
-         # Then, grab all the results from the queue
-         for name in batch_pending:
-             batch_pending[name].wait()
-             batch_results[name] = batch_pending[name].poll()
-
-             # If we get a 0, save it as a valid DNS name and send to the
-             # callback, if defined.
-             if batch_results[name] == 0:
-                 valid_names.append(name)
+         pool = ThreadPool(threads)
+
+         # Because pool.map takes only a single function argument, we need to
+         # define this partial so that each iteration uses the same nameserver
+         dns_lookup_params = partial(dns_lookup, nameserver)
+
+         results = pool.map(dns_lookup_params, batch)
+
+         # We should now have the batch of results back; process them.
+         for name in results:
+             if name:
                  if callback:
                      callback(name)
-
-         # Refresh a status message
+                 valid_names.append(name)
+
          current += threads
+
+         # Update the status message
          sys.stdout.flush()
          sys.stdout.write("    {}/{} complete...".format(current, total))
          sys.stdout.write('\r')
+         pool.close()

      # Clear the status message
      sys.stdout.write('                            \r')

-     # Return the list of valid dns names
      return valid_names

  def list_bucket_contents(bucket):
@@ ... @@
      with open(LOGFILE, 'a') as log_writer:
          log_writer.write(text.lstrip())

+ def get_brute(brute_file, mini=1, maxi=63, banned='[^a-z0-9_-]'):
+     """
+     Generates a list of brute-force words based on length and allowed chars
+     """
+     # Read the brute force file into memory
+     with open(brute_file, encoding="utf8", errors="ignore") as infile:
+         names = infile.read().splitlines()
+
+     # Clean up the names to be usable for containers
+     banned_chars = re.compile(banned)
+     clean_names = []
+     for name in names:
+         name = name.lower()
+         name = banned_chars.sub('', name)
+         if maxi >= len(name) >= mini:
+             if name not in clean_names:
+                 clean_names.append(name)
+
+     return clean_names

  def start_timer():
      """
      Starts a timer for functions in main module
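
For orientation, a minimal sketch of how the check modules drive these two helpers (hypothetical candidate name and callback; the signatures are the ones defined above):

```python
from enum_tools import utils

def print_response(reply):
    # Hypothetical callback: receives each HTTP reply in real time
    if reply.status_code == 200:
        print("    Found: {}".format(reply.url))

# Illustration only: resolve candidate hostnames first, then probe the
# live ones over HTTP, mirroring the pattern used by the check modules.
candidates = ['somecompany.s3.amazonaws.com']
valid = utils.fast_dns_lookup(candidates, '8.8.8.8', threads=5)
utils.get_url_batch(valid, use_ssl=True,
                    callback=print_response, threads=5)
```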

--- a/requirements.txt
+++ b/requirements.txt
- requests_futures==0.9.9
+ dnspython
+ requests
+ requests_futures