python-faraday / 88e27ed
Update upstream source from tag 'upstream/3.15.0'
Update to upstream version '3.15.0' with Debian dir da9454d01ddd533cf447925f0f716ddd8c18a0d6
Sophie Brun, 3 years ago
191 changed file(s) with 2713 addition(s) and 2294 deletion(s).
   - mkdir -p ~/.config/cachix
   - export USER=$(whoami)
   - echo "$CACHIX_CONFG" >~/.config/cachix/cachix.dhall
+  - !reference [ .clone_and_replace_www, script ]
   - cachix use faradaysec
-  - nix-build ./release.nix -A dockerImage --argstr dockerName $CI_REGISTRY_IMAGE --argstr dockerTag latest
+  - nix-build ./release.nix -A dockerImage --argstr dockerName $CI_REGISTRY_IMAGE --argstr dockerTag latest --arg useLastCommit false
   - cp $(readlink result) faraday-server-docker.tar.gz
 artifacts:
   paths:
 image: python:3
 stage: publish
 script:
+  - !reference [ .clone_and_replace_www, script ]
   - apt-get update -qy
   - apt-get install twine -y
   - python setup.py sdist bdist_wheel
-.qa_integration:
+qa_integration:
   stage: upload_testing
   variables:
     REMOTE_BRANCH: $CI_COMMIT_REF_NAME
CHANGELOG/3.10/white.md (deleted, 19 lines):
0 * Use Python 3 instead of Python 2 in the Faraday Server
1 * Add ability to manage agents with multiple executors
2 * Agents can be run with custom arguments
3 * Improved processing of uploaded reports. Now it is much faster!
4 * Add custom fields of type `choice`
5 * Fix vuln status transition in bulk create API (mark closed vulns as re-opened when they are triggered again)
6 * Fix bug when using non-existent workspaces in Faraday GTK Client
7 * Set service name as required in the Web UI
8 * Validate the start date of a workspace is not greater than the end date
9 * Fix command API when year is invalid
10 * When SSL misconfigurations cause websockets to fail, they no longer block the server from starting
11 * Check for invalid service port number in the Web UI
12 * Fix dashboard tooltips for vulnerability
13 * Fix bug when GTK client lost connection to the server
14 * Fix style issues in "Hosts by Service" modal of the dashboard
15 * Add API for bulk delete of vulnerabilities
16 * Add missing vuln attributes to exported CSV
17 * `faraday-manage support` now displays the Operating System version
18 * Notify when `faraday-manage` can't run because of a PostgreSQL HBA config error
CHANGELOG/3.10.1/white.md (deleted, 5 lines):
0 * Fix installation with `pip install --no-binary :all: faradaysec`
1 * Force usage of webargs 5 (webargs 6 broke backwards compatibility)
2 * Use latest version of faraday-plugins
3 * Fix broken "Faraday Plugin" menu entry in the GTK client
4 * Extract export csv to reuse for reports
CHANGELOG/3.10.2/white.md (deleted, 7 lines):
0 * Fix Cross-Site Request Forgery (CSRF) vulnerability in all JSON API endpoints.
1 This was caused by a third-party library that doesn't implement proper
2 Content-Type header validation. To mitigate the vulnerability, we set the
3 session cookie to have the `SameSite: Lax` property (see the sketch after this list).
4 * Fix Faraday Server logs that were always in debug level
5 * Add update date column when exporting vulnerabilities to CSV
6 * Fix unicode error when exporting vulnerabilities to CSV
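The `SameSite: Lax` mitigation above corresponds to a standard Flask session-cookie setting. A minimal sketch, assuming a bare Flask app (Faraday's real configuration lives in the server code, not in this diff):

```python
from flask import Flask, jsonify

app = Flask(__name__)

# SameSite=Lax tells browsers not to attach the session cookie to cross-site
# POST/PUT/DELETE requests, which blocks the CSRF vector described above even
# if a JSON endpoint skips Content-Type validation.
app.config["SESSION_COOKIE_SAMESITE"] = "Lax"
app.config["SESSION_COOKIE_HTTPONLY"] = True  # keep the cookie out of reach of JS


@app.route("/ping")
def ping():
    return jsonify(status="ok")
```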
CHANGELOG/3.11/white.md (deleted, 50 lines):
0 * Move GTK client to [another repository](https://github.com/infobyte/faraday-client) to improve release times.
1 * Fix formula injection vulnerability when exporting vulnerability data to CSV. This was considered a low impact vulnerability.
2 * Remove "--ssl" parameter. Read SSL information from the config file.
3 * Add OpenAPI autogenerated documentation support
4 * Show agent information in command history
5 * Add bulk delete endpoint for hosts API
6 * Add column with information to track agent execution data
7 * Add tool attribute to vulnerability to avoid incorrectly showing "Web UI" as creator tool
8 * Add sorting by target in credentials view
9 * Add creator information when uploading reports or using the bulk create API
10 * Add feature to disable rules in the searcher
11 * Add API endpoint to export Faraday data to Metasploit XML format
12 * Change websocket url route from / to /websockets
13 * Use run date instead of creation date when plugins report specifies it
14 * Improve knowledge base UX
15 * Improve workspace table and status report table UX.
16 * Improve format of exported CSV to include more fields
17 * Sort results in count API endpoint
18 * Limit description width in knowledge base
19 * Change log date format to ISO 8601
20 * Fix parsing server port config in server.ini
21 * Fix bug when \_rev was sent to the hosts API
22 * Send JSON response when you get a 500 or 404 error
23 * Fix bug parsing invalid data in NullToBlankString
24
25 Changes in plugins (only available through Web UI, not in GTK client yet):
26
27 New plugins:
28
29 * Checkmarx
30 * Faraday\_csv (output of exported Faraday csv)
31 * Qualyswebapp
32 * Whitesource
33
34 Updated plugins:
35
36 * Acunetix
37 * AppScan
38 * Arachni
39 * Nessus
40 * Netsparker
41 * Netsparker Cloud
42 * Nexpose
43 * Openvas
44 * QualysGuard
45 * Retina
46 * W3af
47 * WPScan
48 * Webinspect
49 * Zap
CHANGELOG/3.11.1/white.md (deleted, 3 lines):
0 * Fix missing shodan icon and invalid link in dashboard and hosts list
1 * Upgrade marshmallow, webargs, werkzeug and flask-login dependencies to
2 latest versions in order to make packaging for distros easier
CHANGELOG/3.12/white.md (deleted, 20 lines):
0 * Now agents can upload data to multiple workspaces
1 * Add agent and executor data to Activity Feed
2 * Add session timeout configuration to server.ini configuration file
3 * Add hostnames to already existing hosts when importing a report
4 * Add new faraday background image
5 * Display an error when uploading an invalid report
6 * Use minimized JS libraries to improve page load time
7 * Fix aspect ratio distortion in evidence tab of vulnerability preview
8 * Fix broken Knowledge Base upload modal
9 * Fix closing of websocket connections when communicating with Agents
10 * Change Custom Fields names in exported CSV to make columns compatible with
11 `faraday_csv` plugin
12 * Fix import CSV for vuln template: some values were overwritten with default values.
13 * Catch errors in faraday-manage commands when the connection string is not
14 specified in the server.ini file
15 * Fix bug that generated a session when using Token authentication
16 * Fix bug that requested to the API when an invalid filter is used
17 * Cleanup old sessions when a user logs in
18 * Remove unmaintained Flask-Restless dependency
19 * Remove pbkdf2\_sha1 and plain password schemes. We only support bcrypt
CHANGELOG/3.14.0/white.md (deleted, 28 lines):
0 * ADD RESTless filter to multiple views, improving searches (see the sketch after this list)
1 * ADD "extras" modal in options menu, linking to other Faraday resources
2 * ADD `import vulnerability templates` command to faraday-manage
3 * ADD `generate nginx config` command to faraday-manage
4 * ADD vulnerabilities severities count to host
5 * ADD Active Agent columns to workspace
6 * ADD critical vulns count to workspace
7 * ADD `Remember me` login option
8 * ADD distinguish host flag
9 * ADD a create_date field to comments
10 * FIX to use new webargs version
11 * FIX Custom Fields view in KB (Vulnerability Templates)
12 * FIX bug on filter endpoint for vulnerabilities with offset and limit parameters
13 * FIX bug raising `403 Forbidden` HTTP error when the first workspace was not active
14 * FIX bug when changing the token expiration
15 * FIX bug in Custom Fields type Choice when choice name is too long.
16 * FIX Vulnerability Filter endpoint Performance improvement using joinedload. Removed several nplusone uses
17 * MOD Updating the template.ini for new installations
18 * MOD Improve SMTP configuration
19 * MOD The agent now indicates how much time it had run (faraday-agent-dispatcher v1.4.0)
20 * MOD Type "Vulnerability Web" cannot have "Host" type as a parent when creating data in bulk
21 * MOD Expiration default time from 1 month to 12 hours
22 * MOD Improve data reference when uploading a new report
23 * MOD Refactor Knowledge Base's bulk create to also accept multiple creation from vulns in the status report.
24 * MOD All HTTP OPTIONS endpoints are now public
25 * MOD Change documentation and what's new links in about
26 * REMOVE Flask static endpoint
27 * REMOVE of our custom logger
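For the RESTless filter entries flagged in the first bullet above, here is a hedged sketch of querying the vulnerability filter endpoint with offset/limit paging. The base URL, the `q` parameter and the Flask-Restless-style filter syntax are assumptions inferred from the changelog wording, not confirmed by this diff:

```python
import json

import requests

BASE = "https://localhost:5985/_api/v2"  # assumed API base URL
HEADERS = {"Authorization": "Token <api-token>"}  # placeholder credentials

# Flask-Restless-style query: critical vulns, newest first, 50 per page.
query = {
    "filters": [{"name": "severity", "op": "eq", "val": "critical"}],
    "order_by": [{"field": "create_date", "direction": "desc"}],
    "offset": 0,
    "limit": 50,
}
resp = requests.get(f"{BASE}/ws/my_workspace/vulns/filter",
                    params={"q": json.dumps(query)},
                    headers=HEADERS, verify=False)
resp.raise_for_status()
print(resp.json())
```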
CHANGELOG/3.14.1/white.md (deleted, 27 lines):
0 * ADD forgot password
1 * ADD update services by bulk_create
2 * ADD FARADAY_DISABLE_LOGS variable to disable logs to filesystem
3 * ADD security logs in `audit.log` file
4 * UPD security dependency Flask-Security-Too v3.4.4
5 * MOD rename total_rows field in filter host response
6 * MOD improved CSV export performance by reducing the number of queries
7 * MOD sanitize the content of vulns' request and response
8 * MOD don't strip newlines in description when exporting CSV
9 * MOD improved threads management on exception
10 * MOD improved performance on vulnerability filter
11 * MOD improved [API documentation](www.api.faradaysec.com)
12 * FIX upload a report with invalid custom fields
13 * ADD v3 API (see the sketch after this list), which includes:
14 * All endpoints end without a trailing `/`
15 * `PATCH {model}/id` endpoints
16 * ~~Bulk update via PATCH `{model}` endpoints~~ In a future release
17 * ~~Bulk delete via DELETE `{model}` endpoints~~ In a future release
18 * Endpoints removed:
19 * `/v2/ws/<workspace_id>/activate/`
20 * `/v2/ws/<workspace_id>/change_readonly/`
21 * `/v2/ws/<workspace_id>/deactivate/`
22 * `/v2/ws/<workspace_name>/hosts/bulk_delete/`
23 * `/v2/ws/<workspace_name>/vulns/bulk_delete/`
24 * Endpoints updated:
25 * `/v2/ws/<workspace_name>/vulns/<int:vuln_id>/attachments/` => \
26 `/v3/ws/<workspace_name>/vulns/<int:vuln_id>/attachment`
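As referenced in the v3 bullet above, the new API drops trailing slashes and adds per-object PATCH. A hedged sketch of a partial update under those rules; the host, credentials and field values are illustrative only:

```python
import requests

BASE = "https://localhost:5985/_api/v3"  # v3 endpoints end without a trailing slash
HEADERS = {"Authorization": "Token <api-token>"}  # placeholder credentials

# PATCH {model}/id: send only the fields that should change.
resp = requests.patch(f"{BASE}/ws/my_workspace/vulns/1234",
                      json={"status": "closed"},
                      headers=HEADERS, verify=False)
resp.raise_for_status()
```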
CHANGELOG/3.14.2/white.md (deleted, 5 lines):
0 * ADD New plugins:
1 * microsoft baseline security analyzer
2 * nextnet
3 * openscap
4 * FIX old versions of Nessus plugins bugs
CHANGELOG/3.14.3/white.md (deleted, 7 lines):
0 * MOD MAJOR Breaking change: Use frontend from another repository
1 * ADD `last_run` to executors and agents
2 * ADD ignore info vulns option (from faraday-plugins 1.4.3)
3 * ADD invalid logins are registered in `audit.log`
4 * ADD agent registration tokens are now 6 digits long and automatically regenerated every 30 seconds (see the sketch after this list)
5 * MOD Fix logout redirect loop
6 * REMOVE support for native SSL
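The changelog does not say how the rotating registration token is generated; a TOTP-style scheme is one natural fit for the 6-digit, 30-second behaviour described above. A minimal, purely illustrative sketch with `pyotp`:

```python
import pyotp

# Server-side secret; regenerating it invalidates all previously shown tokens.
secret = pyotp.random_base32()

# A 6-digit code that rotates every 30 seconds, matching the entry above.
totp = pyotp.TOTP(secret, digits=6, interval=30)
registration_token = totp.now()
print(f"current agent registration token: {registration_token}")

# On registration the server only needs to verify the submitted code,
# allowing one interval of clock drift.
assert totp.verify(registration_token, valid_window=1)
```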
CHANGELOG/3.14.4/white.md (deleted, 1 line):
0 * Updated plugins package, which updates the AppScan plugin
0 * ADD `Basic Auth` support (see the sketch below)
1 * ADD support for GET method in websocket_tokens, POST will be deprecated in the future
2 * ADD CVSS(String), CWE(String), CVE(relationship) columns to vulnerability model and API
3 * ADD agent token's API says the renewal cycling duration
5 * MOD Improve database model to be able to delete workspaces quickly
5 * MOD Improve code style and uses (less flake8 exceptions, py3 `super` style, Flask app as singleton, etc)
6 * MOD workspaces' names regex to verify they cannot contain forward slash (`/`)
7 * MOD Improve bulk create logs
8 * FIX Own schema breaking Marshmallow 3.11.0+
9 * UPD flask_security_too to version 4.0.0+
0 May 18th, 2021
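The `Basic Auth` entry above means API calls can authenticate with standard HTTP Basic credentials instead of a session cookie or token. A hedged example with `requests`; the base URL, the `/v2/info` path and the credentials are placeholders, not confirmed by this diff:

```python
import requests
from requests.auth import HTTPBasicAuth

BASE = "https://localhost:5985/_api"  # assumed Faraday API base URL

# With Basic Auth the credentials travel in the Authorization header of each
# request, so no prior login or token retrieval is needed.
resp = requests.get(f"{BASE}/v2/info",
                    auth=HTTPBasicAuth("faraday", "changeme"),  # placeholder creds
                    verify=False)
resp.raise_for_status()
print(resp.json())
```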
CHANGELOG/3.2/white.md (deleted, 7 lines):
0 * Added logical operator AND to status report search
1 * Restkit dependency removed.
2 * Improvement on manage.py change-password
3 * Add feature to show only unconfirmed vulns.
4 * Add ssl information to manage.py status-check
5 * Update wpscan plugin to support latest version.
6 * Allow workspace names starting with numbers.
CHANGELOG/3.3/white.md (deleted, 17 lines):
0 * Add workspace disable feature
1 * Add mac vendor to host and services
2 * Fix typos and add sorting in workspace name (workspace list view)
3 * Improve warning when you try to select hosts instead of services as targets of a Vulnerability Web
4 * Deleted old Nexpose plugin. Now Faraday uses Nexpose-Full.
5 * Update sqlmap plugin
6 * Add updated zap plugin
7 * Add hostnames to nessus plugin
8 * Python interpreter in SSLCheck plugin is not hardcoded anymore.
9 * Fix importer key error when some data from couchdb didn't contain the "type" key
10 * Fix AttributeError when importing vulns without exploitation from CouchDB
11 * Fix KeyError in importer.py. This issue occurred during the import of Vulnerability Templates
12 * Fix error when file config.xml doesn't exist at the moment of executing initdb
13 * Improve invalid credentials warning by indicating the user to run Faraday GTK with --login option
14 * Fix typos in VulnDB and add two new vulnerabilities (Default Credentials, Privilege Escalation)
15 * Improved tests performance with new versions of the Faker library
16 * `abort()` calls were checked and changed to `flask.abort()`
CHANGELOG/3.4/white.md (deleted, 14 lines):
0 * In GTK, check that active_workspace is not null
1 * Add fbruteforce services fplugin
2 * Attachments can be added to a vulnerability through the API.
3 * Catch gaierror error on lynis plugin
4 * Add OR and NOT with parenthesis support on status report search
5 * Info API now is public
6 * Web UI now detects Appscan plugin
7 * Improve performance on the workspace using a custom query
8 * Workspaces can be set as active/disabled in the welcome page.
9 * Change Nmap plugin, response field in VulnWeb now goes to Data field.
10 * Update code to support latest SQLAlchemy version
11 * Fix `create_vuln` fplugin bug that incorrectly reported duplicated vulns
12 * Attachments on a vulnerability can be deleted through the API.
13 * Improvement in the coverage of the tests.
CHANGELOG/3.5/white.md (deleted, 18 lines):
0 * Redesign of new/edit vulnerability forms
1 * Add new custom fields feature to vulnerabilities
2 * Add ./manage.py migrate to perform alembic migrations
3 * Faraday will use webargs==4.4.1 because webargs==5.0.0 fails with Python2
4 * New system for online plugins using Threads, a few fixes for metasploit plugin online also.
5 * Fix Command "python manage.py process-reports" now stops once all reports have been processed
6 * Fix bug in query when it checks if a vulnerability or a workspace exists
7 * Fix Once a workspace is created through the web UI, a folder with its name is created inside ~/.faraday/report/
8 * The manage.py now has a new support functionality that creates a .zip file with all the information Faraday's support team will need to troubleshoot your issue
9 * Status-check checks PostgreSQL encoding
10 * Fix a bug where, when report importation failed, the command duration said "In Progress" forever.
11 * Fix confirmed bug in vulns API
12 * Update websockets code to use latest lib version
13 * bootstrap updated to v3.4.0
14 * Manage.py support now throws a message once it finishes the process.
15 * Update Lynis to its version 2.7.1
16 * Updated arp-scan plugin, added support in the Host class for mac address which was deprecated before v3.0
17 * OpenVAS Plugin now supports OpenVAS v-9.0.3
CHANGELOG/3.6/white.md (deleted, 29 lines):
0 * Fix CSRF (Cross-Site Request Forgery) vulnerability in vulnerability attachments API.
1 This allowed an attacker to upload evidence to vulns. The attacker needed to know the
2 target workspace name and vulnerability id, which made exploitation harder. We
3 classified this as a low impact vulnerability.
4 * Readonly and disabled workspaces
5 * Add fields 'impact', 'easeofresolution' and 'policyviolations' to vulnerability_template
6 * Add pagination in 'Command history', 'Last Vulnerabilities', 'Activity logs' into dashboard
7 * Add status_code field to web vulnerability
8 * Preserve selection after bulk edition of vulnerabilities in the Web UI
9 * Faraday's database will be created using UTF-8 encoding
10 * Fix bug of "select a different workspace" from an empty list loop.
11 * Fix bug when creating duplicate custom fields
12 * Fix bug when loading in server.ini with extra configs
13 * Fix `./manage.py command`. It wasn't working since the last schema migration
14 * `./manage.py createsuperuser` command renamed to `./manage.py create-superuser`
15 * Fix bug when non-numeric vulnerability IDs were passed to the attachments API
16 * Fix logic in search exploits
17 * Add ability to 'Searcher' to execute rules in loop with dynamic variables
18 * Send searcher alert with custom mail
19 * Add gitlab-ci.yml file to execute test and pylint on gitlab runner
20 * Fix 500 error when updating services and vulns with specific read-only parameters set
21 * Fix SQLMap plugin to support newer versions of the tool
22 * Improve service's parser for Lynis plugin
23 * Fix bug when parsing URLs in Acunetix reports
24 * Fix and update NetSparker Plugin
25 * Fix bug in nessus plugin. It was trying to create a host without IP. Enabled logs on the server for plugin processing (use --debug)
26 * Fix bug when parsing hostnames in Nessus reports
27 * Fix SSLyze report automatic detection, so reports can be imported from the web ui
28 * Update Dnsmap Plugin
CHANGELOG/3.7/white.md (deleted, 10 lines):
0 * Add vulnerability preview to status report
1 * Update Fierce Plugin. Import can be done from GTK console.
2 * Update Goohost plugin and now Faraday imports Goohost .txt report.
3 * Update plugin to support WPScan v3.4.5
4 * Update Qualysguard plugin to its 8.17.1.0.2 version
5 * Update custom fields with Searcher
6 * Update Recon-ng Plugin so that it accepts XML reports
7 * Add postgres version to status-change command
8 * Couchdb configuration section will not be added anymore
9 * Add unit test for config/default.xml
CHANGELOG/3.7.2/white.md (deleted, empty file)
CHANGELOG/3.7.3/white.md (deleted, 4 lines):
0 * Add parser for connection string at PGCli connection
1 * Fix bug when using custom fields, we must use the field_name instead of the display_name
2 * Fix user's menu visibility when vuln detail is open.
3 * Fix bug in status report that incorrectly showed standard vulns as if they were vulnwebs
CHANGELOG/3.8/white.md (deleted, 36 lines):
0 * Refactor the project to use absolute imports to make the installation easier
1 (with a setup.py file). This also was a first step to make our codebase
2 compatible with python 3.
3 * Change the commands used to run faraday. `./faraday-server.py`,
4 `./manage.py`, `./faraday.py` and `bin/flugin` are replaced for `faraday-server`, `faraday-manage`,
5 `faraday-client` and `fplugin` respectively
6 * Changed suggested installation method. Now we provide binary executables with all python dependencies
7 embedded into them
8 * Add admin panel to the Web UI to manage custom fields
9 * Fix slow host list when creating vulns in a workspace with many hosts
10 * Usability improvements in status report: change the way vulns are selected and confirmed
11 * Improve workspace creation from the Web UI
12 * Fix attachment api when file was not found in .faraday/storage
13 * Fix visualization of the fields Policy Violations and References.
14 * Add a setting in server.ini to display the Vulnerability Cost widget of the Dashboard
15 * Fix status report resize when the browser console closes.
16 * Fix severity dropdown when creating vulnerability templates
17 * Update OS icons in the Web UI.
18 * Fix bug when using custom fields, we must use the field\_name instead of the display\_name
19 * Prevent creation of custom fields with the same name
20 * Add custom fields to vuln templates.
21 * Fix user's menu visibility when vuln detail is open
22 * Remove "show all" option in the status report pagination
23 * The activity feed widget of the dashboard now displays the hostname of the
24 machine that ran each command
25 * Add loading spinner in hosts report.
26 * Fix "invalid dsn" bug in sql-shell
27 * Fix hostnames bug in Nikto and Core Impact plugins
28 * Change Openvas plugin: Low and Debug threats are not taken as vulnerabilities.
29 * Add fplugin command to close vulns created after a certain time
30 * Add list-plugins command to faraday-manage to see all available plugins
31 * Fix a logging error in PluginBase class
32 * Fix an error when using NexposePlugin from command line.
33 * Add CSV parser to Dnsmap Plugin
34 * Fix bug when creating web vulnerabilities in dirb plugin
35 * Change Nexpose Severity Mappings.
CHANGELOG/3.8.1/white.md (deleted, 1 line):
0 * Add configurations for websocket ssl
CHANGELOG/3.9/white.md (deleted, 44 lines):
0 * Add agents feature for distributed plugin execution
1 * Add an API endpoint to perform a bulk create of many objects (hosts,
2 services, vulns, commands and credentials). This is used to avoid doing a lot
3 of API requests to upload data. Now one request should be enough (see the sketch after this list)
4 * Major style and color changes to the Web UI
5 * Add API token authentication method
6 * Use server side stored sessions to properly invalidate cookies of logged out users
7 * Add "New" button to create credentials without host or service assigned yet
8 * Allow filtering hosts by its service's ports in the Web UI
9 * Performance improvements in vulnerabilities and vulnerability templates API (they
10 were doing a lot of SQL queries because of a programming bug)
11 * Require being in the faraday-manage group when running faraday from a .deb or .rpm package
12 * Change the first page shown after the user logs in. Now it displays a workspace
13 selection dialog
14 * Add API endpoint to import Vuln Templates from a CSV file
15 * Create the exported CSV of the status report in the backend instead of in the
16 frontend, which was much slower
17 * Add API endpoint to import hosts from a CSV file
18 * Add `faraday-manage rename-user` command to change a user's username
19 * Allow resizing columns in Vulnerability Templates view
20 * Avoid copying technical details when a vuln template is generated from the status report
21 * Use exact matches when searching vulns by target
22 * Add API endpoint to get which tools impacted a host
23 * Add pagination to activity feed
24 * Add ordering for date and creator to vuln templates view
25 * Modify tabs in vuln template, add Details tab
26 * Add copy IP to clipboard button in hosts view
27 * Add creator and create date columns to vuln template view
28 * When a plugin creates a host with its IP set to a domain name,
29 resolve the IP address of that domain
30 * Add support for logging in RFC5254 format
31 * Add active filter in workspaces view. Only show active workspaces
32 in other parts of the Web UI
33 * Enforce end date to be greater than start date in workspaces API
34 * Fix bug in `faraday-manage create-tables` that incorrectly marked schema
35 migrations as applied
36 * Fix bug in many plugins that loaded hostnames incorrectly (one hostname per character)
37 * Improve references parsing in OpenVAS plugin
38 * Fix a bug in Nessus plugin when parsing reports without host\_start
39 * Fix bug hostname search is now working in status-report
40 * Fix showing of services with large names in the Web UI
41 * Fix broken select all hosts checkbox
42 * Fix bug viewing an attachment/evidence when its filename contained whitespaces
43 * Fix "Are you sure you want to quit Faraday?" dialog showing twice in GTK
CHANGELOG/3.9.3/white.md (deleted, 3 lines):
0 * Fix unicode error when exporting vulns to CSV
1 * Add vuln attributes to CSV
2 * Fix hostname parsing and add external ID to Qualys plugin
11 =====================================
22
33
4 3.15.0 [May 18th, 2021]:
5 ---
6
7 * ADD `Basic Auth` support
8 * ADD support for GET method in websocket_tokens, POST will be deprecated in the future
9 * ADD CVSS(String), CWE(String), CVE(relationship) columns to vulnerability model and API
10 * ADD agent token's API says the renewal cycling duration
11 * MOD Improve database model to be able to delete workspaces quickly
12 * MOD Improve code style and uses (less flake8 exceptions, py3 `super` style, Flask app as singleton, etc)
13 * MOD workspaces' names regex to verify they cannot contain forward slash (`/`)
14 * MOD Improve bulk create logs
15 * FIX Own schema breaking Marshmallow 3.11.0+
16 * UPD flask_security_too to version 4.0.0+
17
418 3.14.4 [Apr 15th, 2021]:
519 ---
620 * Updated plugins package, which update appscan plugin
7
821
922 3.14.3 [Mar 30th, 2021]:
1023 ---
107120 * Cleanup old sessions when a user logs in
108121 * Remove unmaintained Flask-Restless dependency
109122 * Remove pbkdf2\_sha1 and plain password schemes. We only support bcrypt
123
124 3.11.2:
125 ---
110126
111127 3.11.1 [Jun 3rd, 2020]:
112128 ---
310326 * Fix user's menu visibily when vuln detail is open.
311327 * Fix bug in status report that incorrectly showed standard vulns like if they were vulnwebs
312328
313 3.7.2:
314 ---
315
316329 3.7:
317330 ---
318331 * Add vulnerability preview to status report
0 import json
1 from pathlib import Path
2 from typing import Dict
03
1 import os
24 import packaging.version
35
4 LEVEL = "white"
6 LEVEL = "community"
7 LEVELS = ["community", "prof", "corp"] if LEVEL == "corp" else ["community"]
8 MD_FILES = ["community.md", "prof.md", "corp.md", "date.md"] if LEVEL == "corp" else ["community.md", "date.md"]
59
6 def match(elem):
10
11 def match(elem: str):
712 try:
813 ans = packaging.version.Version(elem)
914 except packaging.version.InvalidVersion as e:
1116 return False
1217 return ans
1318
14 IGNORED_FILES = ["white.md", "pink.md", "black.md", "date.md"]
1519
16 def addFile(filename,changelog_file,to=None):
17 with open(filename, "r") as date_file:
18 if to:
19 changelog_file.write(date_file.readline()[:to])
20 else:
21 changelog_file.writelines(date_file.readlines())
20 def add_md_file(filename, changelog_file):
21 with filename.open("r") as date_file:
22 changelog_file.write(date_file.readline()[:-1])
2223
23 def main(level):
2424
25 ls_ans = os.listdir(".")
26 folders = list(sorted(filter(lambda el: el, map(lambda elem: match(elem),ls_ans)),reverse=True))
27 with open("RELEASE.md","w") as changelog_file:
25 def get_md_text_from_json_file(filepath: Path, level_dict):
26 with filepath.open("r") as file:
27 file_json: Dict = json.loads(file.read())
28 level = file_json.get("level")
29 level_dict[level] += f" * {file_json.get('md')}\n"
30
31
32 def main():
33 ls_ans = [path.name for path in Path(__file__).parent.iterdir()]
34 folders = list(sorted(filter(lambda el: el, map(lambda elem: match(elem), ls_ans)), reverse=True))
35 with (Path(__file__).parent / "RELEASE.md").open("w") as changelog_file:
2836 if "header.md" in ls_ans:
29 with open("header.md", "r") as header_file:
37 with (Path(__file__).parent / "header.md").open("r") as header_file:
3038 changelog_file.writelines(header_file.readlines())
3139 changelog_file.writelines("\n\n")
3240 for folder in folders:
3341 changelog_file.write(str(folder))
34 inner_files = list(filter(lambda elem: elem.endswith(".md") ,os.listdir("./" + str(folder))))
35 if "date.md" in inner_files:
42 inner_files = list(filter(lambda elem: elem.suffix == ".json" or elem.name in MD_FILES,
43 (Path(__file__).parent / str(folder)).iterdir()))
44 if any([file.name == "date.md" for file in inner_files]):
3645 changelog_file.write(" [")
37 addFile("./" + str(folder) + "/date.md",changelog_file,-1)
46 add_md_file(Path(__file__).parent / str(folder) / "date.md", changelog_file)
3847 changelog_file.write("]")
3948 changelog_file.writelines(":\n---\n")
40 if level != "white":
41 addFile("./" + str(folder) + "/white.md",changelog_file)
42 if level == "black":
43 addFile("./" + str(folder) + "/pink.md",changelog_file)
44 level_filename = "./" + str(folder) + "/" + level + ".md"
4549
46 previous = [""]
47 if level + ".md" in os.listdir("./" + str(folder)):
48 with open(level_filename, "r") as level_file:
49 previous = level_file.readlines()
50
51 with open(level_filename, "w") as level_file:
52 level_file.writelines(previous)
53 for inner_file_name in inner_files:
54 if inner_file_name not in IGNORED_FILES:
55 level_file.write(" * ")
56 addFile("./" + str(folder) + "/" + inner_file_name, level_file)
57 level_file.write("\n")
58 os.remove("./" + str(folder) + "/" + inner_file_name)
59 addFile(level_filename, changelog_file)
50 level_dicts = {level: "" for level in LEVELS}
51 for level in LEVELS:
52 if any([file.name == f"{level}.md" for file in inner_files]):
53 with (Path(__file__).parent / str(folder) / f"{level}.md").open("r") as level_file:
54 level_dicts[level] = level_file.read()
55 for inner_file in filter(lambda elem: elem.suffix == ".json", inner_files):
56 get_md_text_from_json_file(inner_file, level_dicts)
57 inner_file.unlink()
58 for level in LEVELS:
59 with (Path(__file__).parent / str(folder) / f"{level}.md").open("w") as level_file:
60 level_file.write(level_dicts[level])
61 changelog_file.write(level_dicts[level])
6062 changelog_file.writelines("\n")
6163
6264 if "footer.md" in ls_ans:
63 with open("footer.md", "r") as footer_file:
65 with (Path(__file__).parent / "footer.md").open("r") as footer_file:
6466 changelog_file.writelines(footer_file.readlines())
6567
6668
6769 if __name__ == '__main__':
68 level = LEVEL # if not level_passed else level_pased
69 main(level)
70
71 # I'm Py3
70 main()
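The rewritten generator above now assembles `RELEASE.md` from per-change JSON files instead of hand-edited `white.md`/`pink.md`/`black.md` files. Judging from the keys the diff reads (`level` and `md`), an individual entry looks roughly like the sketch below; the filename is hypothetical:

```python
# CHANGELOG/3.15.0/1234_basic_auth.json -- hypothetical entry file
entry = {
    "level": "community",              # one of: community, prof, corp
    "md": "ADD `Basic Auth` support",  # a single markdown bullet, without "* "
}

# What get_md_text_from_json_file() does with it, in isolation:
level_dict = {"community": "", "prof": "", "corp": ""}
level_dict[entry["level"]] += f" * {entry['md']}\n"
print(level_dict["community"], end="")  # -> " * ADD `Basic Auth` support"
```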
1414 assert file == "keep", file
1515 version_folder = changelog_folder / version_id
1616 for file in os.listdir(version_folder):
17 assert file in ["date.md", "white.md", "pink.md", "black.md"], file
17 assert file in ["date.md", "community.md", "prof.md", "corp.md"], file
1818
1919
2020 if __name__ == '__main__':
2929 $ source faraday_env/bin/activate
3030 $ git clone [email protected]:infobyte/faraday.git
3131 $ cd faraday
32 $ git clone [email protected]:infobyte/faraday-angular.git faraday/frontend
32 $ git clone https://github.com/infobyte/faraday_angular_frontend.git faraday/frontend
3333 $ pip install .
3434 ```
3535
11 =====================================
22
33
4 3.15.0 [May 18th, 2021]:
5 ---
6
7 * ADD `Basic Auth` support
8 * ADD support for GET method in websocket_tokens, POST will be deprecated in the future
9 * ADD CVSS(String), CWE(String), CVE(relationship) columns to vulnerability model and API
10 * ADD agent token's API says the renewal cycling duration
11 * MOD Improve database model to be able to delete workspaces quickly
12 * MOD Improve code style and uses (less flake8 exceptions, py3 `super` style, Flask app as singleton, etc)
13 * MOD workspaces' names regex to verify they cannot contain forward slash (`/`)
14 * MOD Improve bulk create logs
15 * FIX Own schema breaking Marshmallow 3.11.0+
16 * UPD flask_security_too to version 4.0.0+
17
418 3.14.4 [Apr 15th, 2021]:
519 ---
620 * Updated plugins package, which update appscan plugin
7
821
922 3.14.3 [Mar 30th, 2021]:
1023 ---
107120 * Cleanup old sessions when a user logs in
108121 * Remove unmaintained Flask-Restless dependency
109122 * Remove pbkdf2\_sha1 and plain password schemes. We only support bcrypt
123
124 3.11.2:
125 ---
110126
111127 3.11.1 [Jun 3rd, 2020]:
112128 ---
310326 * Fix user's menu visibily when vuln detail is open.
311327 * Fix bug in status report that incorrectly showed standard vulns like if they were vulnwebs
312328
313 3.7.2:
314 ---
315
316329 3.7:
317330 ---
318331 * Add vulnerability preview to status report
11 # Copyright (C) 2013 Infobyte LLC (http://www.infobytesec.com/)
22 # See the file 'doc/LICENSE' for the license information
33
4 __version__ = '3.14.4'
4 __version__ = '3.15.0'
55 __license_version__ = __version__
99 import sys
1010 import platform
1111 import logging
12
1213 # If is linux and its installed with deb or rpm, it must run with a user in the faraday group
1314 if platform.system() == "Linux":
1415 import grp
1516 from getpass import getuser
17
1618 try:
1719 FARADAY_GROUP = "faraday"
1820 faraday_group = grp.getgrnam(FARADAY_GROUP)
19 #The current user may be different from the logged user
21 # The current user may be different from the logged user
2022 current_user = getuser()
2123 if current_user != 'root' and faraday_group.gr_gid not in os.getgroups():
2224 print(f"\n\nUser ({os.getlogin()}) must be in the '{FARADAY_GROUP}' group.")
4749 from faraday.server.commands import nginx_config
4850 from faraday.server.commands import import_vulnerability_template
4951 from faraday.server.models import db, User
50 from faraday.server.web import app
52 from faraday.server.web import get_app
5153 from faraday_plugins.plugins.manager import PluginsManager
5254 from flask_security.utils import hash_password
5355
54
5556 CONTEXT_SETTINGS = dict(help_option_names=['-h', '--help'])
5657
57 #logger = logging.getLogger(__name__)
58
59 # logger = logging.getLogger(__name__)
60
5861
5962 @click.group(context_settings=CONTEXT_SETTINGS)
6063 def cli():
7982 openapi_format(format="yaml", server=server, no_servers=no_servers)
8083
8184
82
8385 @click.command(help="Import Vulnerability templates")
8486 @click.option('--language', required=False, default='en')
8587 @click.option('--list-languages', is_flag=True)
8991
9092 @click.command(help="Create Faraday DB in Postgresql, also tables and indexes")
9193 @click.option(
92 '--choose-password', is_flag=True, default=False,
93 help=('Instead of using a random password for the user "faraday", '
94 'ask for the desired one')
95 )
94 '--choose-password', is_flag=True, default=False,
95 help=('Instead of using a random password for the user "faraday", '
96 'ask for the desired one')
97 )
9698 @click.option(
97 '--password', type=str, default=False,
98 help=('Instead of using a random password for the user "faraday", '
99 'use the one provided')
100 )
99 '--password', type=str, default=False,
100 help=('Instead of using a random password for the user "faraday", '
101 'use the one provided')
102 )
101103 def initdb(choose_password, password):
102 with app.app_context():
104 with get_app().app_context():
103105 InitDB().run(choose_password=choose_password, faraday_user_password=password)
104106
105107
128130 @click.option('--check_dependencies', default=False, is_flag=True)
129131 @click.option('--check_config', default=False, is_flag=True)
130132 def status_check(check_postgresql, check_faraday, check_dependencies, check_config):
131
132133 selected = False
133134 exit_code = 0
134135 if check_postgresql:
167168
168169
169170 def validate_user_unique_field(ctx, param, value):
170 with app.app_context():
171 with get_app().app_context():
171172 try:
172173 if User.query.filter_by(**{param.name: value}).count():
173174 raise click.ClickException("User already exists")
189190 # Also validate that the email doesn't exist in the database
190191 return validate_user_unique_field(ctx, param, value)
191192
193
192194 @click.command(help="List Available Plugins")
193195 def list_plugins():
194196 plugins_manager = PluginsManager()
195197 for _, plugin in plugins_manager.get_plugins():
196198 click.echo(f"{plugin.id}")
199
197200
198201 @click.command(help="Create ADMIN user for Faraday application")
199202 @click.option('--username', prompt=True, callback=validate_user_unique_field)
201204 @click.option('--password', prompt=True, hide_input=True,
202205 confirmation_prompt=True)
203206 def create_superuser(username, email, password):
204 with app.app_context():
207 with get_app().app_context():
205208 if db.session.query(User).filter_by(active=True).count() > 0:
206 print("Can't create more users. The comumunity edition only allows one user. Please contact support for further information.")
209 print(
210 "Can't create more users. The comumunity edition only allows one user. Please contact support for further information.")
207211 sys.exit(1)
208212
209 app.user_datastore.create_user(username=username,
213 get_app().user_datastore.create_user(username=username,
210214 email=email,
211215 password=hash_password(password),
212216 role='admin',
218222
219223
220224 @click.command(help="Create database tables. Requires a functional "
221 "PostgreSQL database configured in the server.ini")
225 "PostgreSQL database configured in the server.ini")
222226 def create_tables():
223 with app.app_context():
227 with get_app().app_context():
224228 # Ugly hack to create tables and also setting alembic revision
225229 conn_string = faraday.server.config.database.connection_string
226230 if not conn_string:
236240 'Tables created successfully!',
237241 fg='green', bold=True))
238242
243
239244 @click.command(help="Generates a .zip file with technical information")
240245 def support():
241246 support_zip.all_for_support()
242247
243248
244249 @click.command(
245 context_settings={"ignore_unknown_options": True},
246 help='Migrates database schema. If the target revision '
247 'is not specified, use "head" when upgrading and "-1" when '
248 'downgrading')
250 context_settings={"ignore_unknown_options": True},
251 help='Migrates database schema. If the target revision '
252 'is not specified, use "head" when upgrading and "-1" when '
253 'downgrading')
249254 @click.option(
250 '--downgrade',
251 help="Perform a downgrade migration instead of an upgrade one",
252 is_flag=True)
255 '--downgrade',
256 help="Perform a downgrade migration instead of an upgrade one",
257 is_flag=True)
253258 @click.argument(
254 'revision',
255 required=False,
256 )
259 'revision',
260 required=False,
261 )
257262 def migrate(downgrade, revision):
258263 try:
259264 revision = revision or ("-1" if downgrade else "head")
290295 @click.option('--current_username', required=True, prompt=True)
291296 @click.option('--new_username', required=True, prompt=True)
292297 def rename_user(current_username, new_username):
293 if(current_username == new_username):
298 if (current_username == new_username):
294299 print("\nERROR: Usernames must be different.")
295300 sys.exit(1)
296301 else:
297302 change_username.change_username(current_username, new_username)
303
298304
299305 @click.command(help="Generate nginx config")
300306 @click.option('--fqdn', prompt='Server FQDN', help='The FQDN of your faraday server', type=str, show_default=True)
301307 @click.option('--port', prompt='Faraday port', help='Faraday listening port', type=int, default=5985)
302308 @click.option('--ws-port', prompt='Faraday Websocket port', help='Faraday websocket listening port', type=int,
303309 default=9000, show_default=True)
304 @click.option('--ssl-certificate', prompt='SSL Certificate Path', help='SSL Certificate Path', type=click.Path(exists=True))
310 @click.option('--ssl-certificate', prompt='SSL Certificate Path', help='SSL Certificate Path',
311 type=click.Path(exists=True))
305312 @click.option('--ssl-key', prompt='SSL Key Path', help='SSL Key Path', type=click.Path(exists=True))
306313 @click.option('--multitenant-url', help='URL for multitenant config', type=str)
307314 def generate_nginx_config(fqdn, port, ws_port, ssl_certificate, ssl_key, multitenant_url):
308315 nginx_config.generate_nginx_config(fqdn, port, ws_port, ssl_certificate, ssl_key, multitenant_url)
316
309317
310318 cli.add_command(show_urls)
311319 cli.add_command(initdb)
325333 cli.add_command(generate_nginx_config)
326334 cli.add_command(import_vulnerability_templates)
327335
328
329336 if __name__ == '__main__':
330
331337 cli()
332338
333
334339 # I'm Py3
00
11 import logging
22 import faraday.server.config
3 from faraday.server.web import app
3 from faraday.server.web import get_app
44 from faraday.server.models import db
55
66 from alembic import context
7 from sqlalchemy import engine_from_config, pool
87 from logging.config import fileConfig
98
109 # this is the Alembic Config object, which provides
6160 and associate a connection with the context.
6261
6362 """
64 with app.app_context():
63 with get_app().app_context():
6564 connectable = db.engine
6665
6766 with connectable.connect() as connection:
6867 context.configure(
6968 connection=connection,
70 target_metadata=target_metadata
69 target_metadata=target_metadata,
70 compare_type=True
7171 )
7272
7373 with context.begin_transaction():
7474 context.run_migrations()
75
7576
7677 if context.is_offline_mode():
7778 run_migrations_offline()
66 """
77 from alembic import op
88 import sqlalchemy as sa
9
109
1110 # revision identifiers, used by Alembic.
1211 revision = '0d216660da28'
2423 'executive_report',
2524 'workspace',
2625 'task'
27 )
26 )
2827
2928
3029 def upgrade():
4342 # the syntax of the sql is invalid for postgresql and it also tries to
4443 # create the enum when it already exists.
4544 op.add_column('notification',
46 sa.Column(
47 'object_type',
48 sa.Enum(OBJECT_TYPES, name='object_types'),
49 nullable=False
50 )
51 )
52
45 sa.Column(
46 'object_type',
47 sa.Enum(OBJECT_TYPES, name='object_types'),
48 nullable=False
49 )
50 )
5351
5452 op.create_foreign_key(
5553 'notification_user_id_fkey',
6664
6765 def downgrade():
6866 op.drop_table('notification')
69 #op.drop_constraint(None, 'notification_user_id_fkey', type_='foreignkey')
70 #op.drop_constraint(None, 'notification_workspace_id_fkey', type_='foreignkey')
67 # op.drop_constraint(None, 'notification_user_id_fkey', type_='foreignkey')
68 # op.drop_constraint(None, 'notification_workspace_id_fkey', type_='foreignkey')
7169 # I'm Py3
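
As the comment in the hunk above notes, letting Alembic auto-generate the column DDL would both produce invalid PostgreSQL syntax and try to re-create an enum type that already exists. A minimal sketch of the usual alternative on PostgreSQL, reusing an existing type named object_types (the abbreviated OBJECT_TYPES tuple below stands in for the full tuple defined at the top of the migration):

    from alembic import op
    import sqlalchemy as sa
    from sqlalchemy.dialects import postgresql

    # Abbreviated stand-in for the OBJECT_TYPES tuple defined in the migration above.
    OBJECT_TYPES = ('executive_report', 'workspace', 'task')

    def upgrade():
        op.add_column(
            'notification',
            sa.Column(
                'object_type',
                # create_type=False stops SQLAlchemy from emitting CREATE TYPE,
                # so the existing 'object_types' enum is reused instead of re-created.
                postgresql.ENUM(*OBJECT_TYPES, name='object_types', create_type=False),
                nullable=False,
            ),
        )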
0 """add cascade delete from workspace
1
2 Revision ID: 18891ca61db6
3 Revises: aa56852fa76d
4 Create Date: 2021-04-08 12:09:04.182543+00:00
5
6 """
7 from alembic import op
8
9 # revision identifiers, used by Alembic.
10 revision = '18891ca61db6'
11 down_revision = 'aa56852fa76d'
12 branch_labels = None
13 depends_on = None
14
15
16 def upgrade():
17 # ### commands auto generated by Alembic - please adjust! ###
18 op.drop_constraint('credential_workspace_id_fkey', 'credential', type_='foreignkey')
19 op.create_foreign_key('credential_workspace_id_fkey', 'credential', 'workspace', ['workspace_id'], ['id'], ondelete='CASCADE')
20 op.drop_constraint('host_workspace_id_fkey', 'host', type_='foreignkey')
21 op.create_foreign_key('host_workspace_id_fkey', 'host', 'workspace', ['workspace_id'], ['id'], ondelete='CASCADE')
22 op.drop_constraint('hostname_workspace_id_fkey', 'hostname', type_='foreignkey')
23 op.create_foreign_key('hostname_workspace_id_fkey', 'hostname', 'workspace', ['workspace_id'], ['id'], ondelete='CASCADE')
24 op.drop_constraint('rule_action_uc', 'rule_action', type_='unique')
25 op.drop_constraint('service_workspace_id_fkey', 'service', type_='foreignkey')
26 op.create_foreign_key('service_workspace_id_fkey', 'service', 'workspace', ['workspace_id'], ['id'], ondelete='CASCADE')
27 op.drop_constraint('vulnerability_workspace_id_fkey', 'vulnerability', type_='foreignkey')
28 op.create_foreign_key('vulnerability_workspace_id_fkey', 'vulnerability', 'workspace', ['workspace_id'], ['id'], ondelete='CASCADE')
29 # ### end Alembic commands ###
30
31
32 def downgrade():
33 # ### commands auto generated by Alembic - please adjust! ###
34 op.drop_constraint('vulnerability_workspace_id_fkey', 'vulnerability', type_='foreignkey')
35 op.create_foreign_key('vulnerability_workspace_id_fkey', 'vulnerability', 'workspace', ['workspace_id'], ['id'])
36 op.drop_constraint('service_workspace_id_fkey', 'service', type_='foreignkey')
37 op.create_foreign_key('service_workspace_id_fkey', 'service', 'workspace', ['workspace_id'], ['id'])
38 op.create_unique_constraint('rule_action_uc', 'rule_action', ['rule_id', 'action_id'])
39 op.drop_constraint('hostname_workspace_id_fkey', 'hostname', type_='foreignkey')
40 op.create_foreign_key('hostname_workspace_id_fkey', 'hostname', 'workspace', ['workspace_id'], ['id'])
41 op.drop_constraint('host_workspace_id_fkey', 'host', type_='foreignkey')
42 op.create_foreign_key('host_workspace_id_fkey', 'host', 'workspace', ['workspace_id'], ['id'])
43 op.drop_constraint('credential_workspace_id_fkey', 'credential', type_='foreignkey')
44 op.create_foreign_key('credential_workspace_id_fkey', 'credential', 'workspace', ['workspace_id'], ['id'])
45 # ### end Alembic commands ###
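
For reference, each drop/re-create pair in the migration above boils down to the same DDL; this sketch spells it out for the host table only, using the constraint name taken from the migration:

    from alembic import op

    def upgrade():
        # Equivalent raw-SQL form of the drop/re-create pair for 'host';
        # deleting a workspace row now cascades to its hosts.
        op.execute("ALTER TABLE host DROP CONSTRAINT host_workspace_id_fkey")
        op.execute(
            "ALTER TABLE host ADD CONSTRAINT host_workspace_id_fkey "
            "FOREIGN KEY (workspace_id) REFERENCES workspace (id) ON DELETE CASCADE"
        )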
66 """
77 import json
88 from alembic import op
9 import sqlalchemy as sa
109 from sqlalchemy.sql import text
1110
1211
77 from alembic import op
88 import sqlalchemy as sa
99
10
1110 # revision identifiers, used by Alembic.
1211 revision = '1dbe9e8e4247'
1312 down_revision = 'f8a44acd0e41'
1716
1817 def upgrade():
1918 op.add_column('rule_execution',
20 sa.Column(
21 'start',
22 sa.DateTime(), nullable=True
23 )
24 )
19 sa.Column(
20 'start',
21 sa.DateTime(), nullable=True
22 )
23 )
2524 op.add_column('rule_execution',
26 sa.Column(
27 'end',
28 sa.DateTime(), nullable=True
29 )
30 )
25 sa.Column(
26 'end',
27 sa.DateTime(), nullable=True
28 )
29 )
3130
3231
3332 def downgrade():
34 op.drop_column('rule_execution','start')
35 op.drop_column('rule_execution','end')
33 op.drop_column('rule_execution', 'start')
34 op.drop_column('rule_execution', 'end')
66 """
77 from alembic import op
88 import sqlalchemy as sa
9
109
1110 # revision identifiers, used by Alembic.
1211 revision = '20f3d0c2f71f'
2019 sa.Column(
2120 'advanced_filter_parsed',
2221 sa.String(255),
23 nullable = False,
24 server_default = ""
22 nullable=False,
23 server_default=""
2524 )
26 )
25 )
2726
2827
2928 def downgrade():
30 op.drop_column('executive_report','advanced_filter_parsed')
29 op.drop_column('executive_report', 'advanced_filter_parsed')
55
66 """
77 from alembic import op
8 import sqlalchemy as sa
98
109
1110 # revision identifiers, used by Alembic.
77 from alembic import op
88 import sqlalchemy as sa
99
10
1110 # revision identifiers, used by Alembic.
1211 revision = '526aa91cac98'
1312 down_revision = '085188e0a016'
1413 branch_labels = None
1514 depends_on = None
1615
17 #TODO Verify that the foreign keys are deleted
16
17 # TODO Verify that the foreign keys are deleted
18
1819
1920 def upgrade():
2021 conn = op.get_bind()
2930 sa.Column('tool_name', sa.Text),
3031 sa.Column('false_positive', sa.Integer, nullable=False, default=0),
3132 sa.Column('verified', sa.Integer, nullable=False, default=0),
32 sa.UniqueConstraint('external_identifier', 'tool_name', 'reference_id', name='uix_externalidentifier_toolname_referenceid')
33 )
33 sa.UniqueConstraint('external_identifier', 'tool_name', 'reference_id',
34 name='uix_externalidentifier_toolname_referenceid')
35 )
3436
3537 op.create_foreign_key(
3638 'knowledge_base_vulnerability_template_id_fkey', 'knowledge_base',
3840 )
3941
4042 op.add_column('vulnerability',
41 sa.Column(
42 'association_date',
43 sa.DateTime(), nullable=True
44 )
45 )
43 sa.Column(
44 'association_date',
45 sa.DateTime(), nullable=True
46 )
47 )
4648
4749 op.add_column('vulnerability',
48 sa.Column(
49 'vulnerability_template_id',
50 sa.Integer(),
51 nullable=True
52 )
53 )
50 sa.Column(
51 'vulnerability_template_id',
52 sa.Integer(),
53 nullable=True
54 )
55 )
5456
5557 op.create_foreign_key(
5658 'vulnerability_vulnerability_template_id_fkey',
6062 )
6163
6264 op.add_column('vulnerability',
63 sa.Column(
64 'vulnerability_duplicate_id',
65 sa.Integer(),
66 nullable=True
67 )
68 )
65 sa.Column(
66 'vulnerability_duplicate_id',
67 sa.Integer(),
68 nullable=True
69 )
70 )
6971
7072 op.add_column('vulnerability',
71 sa.Column(
72 'disassociated_manually',
73 sa.Boolean(),
74 nullable=False,
75 server_default='false',
76 )
77 )
73 sa.Column(
74 'disassociated_manually',
75 sa.Boolean(),
76 nullable=False,
77 server_default='false',
78 )
79 )
7880
7981 op.create_foreign_key(
8082 'vulnerability_vulnerability_duplicate_id_fkey',
8385 )
8486
8587 op.add_column('vulnerability_template',
86 sa.Column(
87 'shipped',
88 sa.Boolean(),
89 nullable=False,
90 server_default='false',
91 )
92 )
88 sa.Column(
89 'shipped',
90 sa.Boolean(),
91 nullable=False,
92 server_default='false',
93 )
94 )
9395
9496 conn.execute('ALTER TABLE vulnerability_template DROP CONSTRAINT uix_vulnerability_template_name')
95 conn.execute('ALTER TABLE vulnerability_template ADD CONSTRAINT uix_vulnerability_template_name UNIQUE (name, shipped)')
97 conn.execute(
98 'ALTER TABLE vulnerability_template ADD CONSTRAINT uix_vulnerability_template_name UNIQUE (name, shipped)')
99
96100
97101 def downgrade():
98
99102 conn = op.get_bind()
100103
104 conn.execute('ALTER TABLE vulnerability_template DROP CONSTRAINT uix_vulnerability_template_name')
105 op.drop_column('vulnerability_template', 'shipped')
106 conn.execute('ALTER TABLE vulnerability_template ADD CONSTRAINT uix_vulnerability_template_name UNIQUE (name)')
101107
102 conn.execute('ALTER TABLE vulnerability_template DROP CONSTRAINT uix_vulnerability_template_name')
103 op.drop_column('vulnerability_template','shipped')
104 conn.execute('ALTER TABLE vulnerability_template ADD CONSTRAINT uix_vulnerability_template_name UNIQUE (name)')
105
106108 op.drop_table('knowledge_base')
107109
108 op.drop_column('vulnerability','vulnerability_duplicate_id')
109 op.drop_column('vulnerability','vulnerability_template_id')
110 op.drop_column('vulnerability','association_date')
111 op.drop_column('vulnerability','disassociated_manually')
110 op.drop_column('vulnerability', 'vulnerability_duplicate_id')
111 op.drop_column('vulnerability', 'vulnerability_template_id')
112 op.drop_column('vulnerability', 'association_date')
113 op.drop_column('vulnerability', 'disassociated_manually')
66 """
77
88 from alembic import op
9 import sqlalchemy as sa
109
1110
1211 # revision identifiers, used by Alembic.
55
66 """
77 from alembic import op
8 import sqlalchemy as sa
98
109
1110 # revision identifiers, used by Alembic.
77 from alembic import op
88 import sqlalchemy as sa
99
10
1110 # revision identifiers, used by Alembic.
1211 revision = '84f266a05be3'
1312 down_revision = 'a39a3a6e3f99'
1716
1817 def upgrade():
1918 op.add_column('vulnerability', sa.Column(
20 'tool',
21 sa.Text(),
22 nullable=False,
23 server_default=""
24 )
19 'tool',
20 sa.Text(),
21 nullable=False,
22 server_default=""
2523 )
24 )
2625 conn = op.get_bind()
2726 conn.execute("""UPDATE vulnerability
28 SET tool=SUBQUERY.tool
27 SET tool=SUBQUERY.tool
2928 FROM (select v.id, c.tool from vulnerability v, command_object co, command c where v.id = co.object_id and co.object_type = 'vulnerability' and co.command_id = c.id) AS SUBQUERY
3029 WHERE vulnerability.id=SUBQUERY.id""")
3130 conn.execute("UPDATE vulnerability set tool='Web UI' where tool=''")
32
31
3332
3433 def downgrade():
35 op.drop_column('vulnerability','tool')
36
34 op.drop_column('vulnerability', 'tool')
55
66 """
77
8 import uuid
98 from alembic import op
109 import sqlalchemy as sa
11
12 # revision identifiers, used by Alembic.
13 from sqlalchemy.dialects import postgresql
1410
1511 revision = '9c4091d1a09b'
1612 down_revision = 'be89aa03e35e'
0 """update rule fields
1
2 Revision ID: aa56852fa76d
3 Revises: f0a507afabd4
4 Create Date: 2021-04-12 19:53:48.615218+00:00
5
6 """
7 from alembic import op
8 import sqlalchemy as sa
9 from sqlalchemy.sql import text
10
11 # revision identifiers, used by Alembic.
12 revision = 'aa56852fa76d'
13 down_revision = 'f0a507afabd4'
14 branch_labels = None
15 depends_on = None
16
17
18 def constraint_exists(constraint_name):
19 connection = op.get_bind()
20 result = connection.execute(
21 text("""
22 SELECT exists(
23 SELECT 1
24 from pg_catalog.pg_constraint
25 where conname = :constraint_name
26 ) as exists """
27 ), **{
28 'constraint_name': constraint_name,
29 }
30 ).first()
31
32 return result.exists
33
34
35 def column_exists(table_name, column_name):
36 connection = op.get_bind()
37 result = connection.execute(
38 text("""
39 SELECT exists(
40 SELECT 1
41 FROM information_schema.columns
42 WHERE table_name = :table_name
43 AND column_name = :column_name
44 ) as exists """
45 ), **{
46 'table_name': table_name,
47 'column_name': column_name,
48 }
49 ).first()
50 return result.exists
51
52
53 def upgrade():
54 # ### commands auto generated by Alembic - please adjust! ###
55 if not column_exists('action', 'description'):
56 op.add_column('action', sa.Column('description', sa.String(), nullable=False))
57 op.drop_constraint('condition_rule_id_fkey', 'condition', type_='foreignkey')
58 op.create_foreign_key('condition_rule_id_fkey', 'condition', 'rule', ['rule_id'], ['id'], ondelete='CASCADE')
59 if not column_exists('rule', 'description'):
60 op.add_column('rule', sa.Column('description', sa.String(), nullable=False))
61 if not column_exists('rule', 'name'):
62 op.add_column('rule', sa.Column('name', sa.String(), nullable=False))
63 if not constraint_exists('ux_rule_name'):
64 op.create_unique_constraint('ux_rule_name', 'rule', ['name'])
65 if column_exists('rule', 'object'):
66 op.drop_column('rule', 'object')
67 if not constraint_exists('rule_action_uc'):
68 op.create_unique_constraint('rule_action_uc', 'rule_action', ['rule_id', 'action_id'])
69 # ### end Alembic commands ###
70
71
72 def downgrade():
73 # ### commands auto generated by Alembic - please adjust! ###
74
75 # several items could have been created 'conditionally' by upgrade()
76 # so at this point their presence doesn't tell whether they were created by this script or not
77 # therefore don't do anything to reverse them
78 # a reversion operation does not apply if items weren't created or deleted by this script
79
80 # op.drop_constraint('rule_action_uc', 'rule_action', type_='unique')
81 # op.add_column('rule', sa.Column('object', postgresql.JSONB(astext_type=sa.Text()), autoincrement=False, nullable=False))
82 # op.drop_constraint('ux_rule_name', 'rule', type_='unique')
83 # op.drop_column('rule', 'name')
84 # op.drop_column('rule', 'description')
85 op.drop_constraint('condition_rule_id_fkey', 'condition', type_='foreignkey')
86 op.create_foreign_key('condition_rule_id_fkey', 'condition', 'rule', ['rule_id'], ['id'])
87 # op.drop_column('action', 'description')
88 # ### end Alembic commands ###
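
The constraint_exists/column_exists helpers above make the upgrade safe to re-run on a partially migrated database. A sketch of guarding another change the same way, reusing the column_exists helper defined above (the 'enabled' column is hypothetical, purely for illustration):

    import sqlalchemy as sa
    from alembic import op

    def upgrade():
        # Hypothetical column: only added when a previous, interrupted run
        # has not created it already, keeping the migration idempotent.
        if not column_exists('rule', 'enabled'):  # column_exists() from the migration above
            op.add_column(
                'rule',
                sa.Column('enabled', sa.Boolean(), nullable=False, server_default='true'),
            )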
44 Create Date: 2020-04-02 20:41:41.083048+00:00
55
66 """
7 from alembic import op
8 import sqlalchemy as sa
97
108 from faraday.server.config import LOCAL_CONFIG_FILE
119 from configparser import ConfigParser, NoSectionError
66 """
77
88 from alembic import op
9 import sqlalchemy as sa
109
1110
1211 # revision identifiers, used by Alembic.
2120 conn.execute('ALTER TABLE executive_report ADD COLUMN filter JSONB')
2221
2322
24
2523 def downgrade():
2624 conn = op.get_bind()
2725 conn.execute('ALTER TABLE executive_report DROP COLUMN filter')
55
66 """
77 from alembic import op
8 import sqlalchemy as sa
98
109
1110 # revision identifiers, used by Alembic.
1918 conn = op.get_bind()
2019 conn.execute('ALTER TABLE vulnerability ADD COLUMN custom_fields JSONB')
2120 conn.execute('ALTER TABLE vulnerability_template ADD COLUMN custom_fields JSONB')
22 conn.execute('CREATE TABLE custom_fields_schema ( '\
23 'id SERIAL PRIMARY KEY,' \
24 'table_name TEXT,' \
25 'field_name TEXT,' \
26 'field_type TEXT,' \
27 'field_order INTEGER,' \
28 'field_display_name TEXT)'
29 )
21 conn.execute('CREATE TABLE custom_fields_schema ( '
22 'id SERIAL PRIMARY KEY,'
23 'table_name TEXT,'
24 'field_name TEXT,'
25 'field_type TEXT,'
26 'field_order INTEGER,'
27 'field_display_name TEXT)'
28 )
29
3030
3131 def downgrade():
3232 conn = op.get_bind()
0 """adding fs uniquifier to user model
1
2 Revision ID: f0a507afabd4
3 Revises: a4def820a5bb
4 Create Date: 2021-02-24 22:08:24.237037+00:00
5
6 """
7 from alembic import op
8 import sqlalchemy as sa
9
10
11 # revision identifiers, used by Alembic.
12 revision = 'f0a507afabd4'
13 down_revision = 'a4def820a5bb'
14 branch_labels = None
15 depends_on = None
16
17
18 def upgrade():
19 # be sure to MODIFY this line to make nullable=True:
20 op.add_column('faraday_user', sa.Column('fs_uniquifier', sa.String(length=64), nullable=True))
21
22 # update existing rows with unique fs_uniquifier
23 import uuid
24 user_table = sa.Table('faraday_user', sa.MetaData(), sa.Column('id', sa.Integer, primary_key=True),
25 sa.Column('fs_uniquifier', sa.String))
26 conn = op.get_bind()
27 for row in conn.execute(sa.select([user_table.c.id])):
28 conn.execute(user_table.update().values(fs_uniquifier=uuid.uuid4().hex).where(user_table.c.id == row['id']))
29
30 # finally - set nullable to false
31 op.alter_column('faraday_user', 'fs_uniquifier', nullable=False)
32
33
34 def downgrade():
35 op.drop_column(
36 'faraday_user',
37 'fs_uniquifier',
38 )
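
An alternative to the per-row backfill loop above is a single set-based UPDATE; this is only a sketch and assumes a PostgreSQL backend (md5(), random() and clock_timestamp() are PostgreSQL functions):

    from alembic import op

    def upgrade():
        # Fill every existing row with a 32-character hex value in one statement
        # instead of issuing one UPDATE per user.
        op.execute(
            "UPDATE faraday_user "
            "SET fs_uniquifier = md5(random()::text || clock_timestamp()::text) "
            "WHERE fs_uniquifier IS NULL"
        )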
66 """
77 from alembic import op
88 import sqlalchemy as sa
9 from faraday.server.fields import JSONType
10 from depot.fields.sqlalchemy import UploadedFileField
119 from sqlalchemy.dialects import postgresql
1210
1311 # revision identifiers, used by Alembic.
00 ###
1 ## Faraday Penetration Test IDE
2 ## Copyright (C) 2018 Infobyte LLC (http://www.infobytesec.com/)
3 ## See the file 'doc/LICENSE' for the license information
1 # Faraday Penetration Test IDE
2 # Copyright (C) 2018 Infobyte LLC (http://www.infobytesec.com/)
3 # See the file 'doc/LICENSE' for the license information
44 ###
5
6 # I'm Py3
99
1010 class ApiError(Exception):
1111 def __init__(self, message):
12 super(ApiError, self).__init__(message)
12 super().__init__(message)
1313
1414
1515 class Structure:
11 # -*- coding: utf-8 -*-
22
33 ###
4 ## Faraday Penetration Test IDE
5 ## Copyright (C) 2018 Infobyte LLC (http://www.infobytesec.com/)
6 ## See the file 'doc/LICENSE' for the license information
4 # Faraday Penetration Test IDE
5 # Copyright (C) 2018 Infobyte LLC (http://www.infobytesec.com/)
6 # See the file 'doc/LICENSE' for the license information
77 ###
88 from builtins import str
99
3131
3232 threshold = 0.75
3333 min_weight = 0.3
34
3435
3536 def compare(a, b):
3637 return SequenceMatcher(None, a, b).ratio()
691692 else:
692693 self.api.set_array(field, value, add=to_add, key=key, object=vuln)
693694 action = 'Adding %s to %s list in vulnerability %s with id %s' % (
694 value, key, vuln.name, vuln.id)
695 value, key, vuln.name, vuln.id)
695696 if not to_add:
696697 action = 'Removing %s from %s list in vulnerability %s with id %s' % (
697698 value, key, vuln.name, vuln.id)
829830 smtp_ssl=ssl
830831 )
831832
832 for d in [output, 'log/']: # TODO CHANGE THIS
833 for d in [output, 'log/']: # TODO CHANGE THIS
833834 if not Path(d):
834835 Path(d).mkdir(parents=True)
835836
11 # -*- coding: utf-8 -*-
22
33 ###
4 ## Faraday Penetration Test IDE
5 ## Copyright (C) 2018 Infobyte LLC (http://www.infobytesec.com/)
6 ## See the file 'doc/LICENSE' for the license information
4 # Faraday Penetration Test IDE
5 # Copyright (C) 2018 Infobyte LLC (http://www.infobytesec.com/)
6 # See the file 'doc/LICENSE' for the license information
77 ###
88 import re
99 import json
00 # Faraday Penetration Test IDE
11 # Copyright (C) 2016 Infobyte LLC (http://www.infobytesec.com/)
22 # See the file 'doc/LICENSE' for the license information
3 # I'm Py3
1212 import sqlalchemy
1313 import datetime
1414 from collections import defaultdict
15 from flask import g
1615 from flask_classful import FlaskView
1716 from sqlalchemy.orm import joinedload, undefer
1817 from sqlalchemy.orm.exc import NoResultFound, ObjectDeletedError
2625 from webargs.flaskparser import FlaskParser
2726 from webargs.core import ValidationError
2827 from flask_classful import route
28 import flask_login
2929
3030 from faraday.server.models import Workspace, db, Command, CommandObject, count_vulnerability_severities
3131 from faraday.server.schemas import NullToBlankString
3232 from faraday.server.utils.database import (
3333 get_conflict_object,
3434 is_unique_constraint_violation
35 )
35 )
3636 from faraday.server.utils.filters import FlaskRestlessSchema
3737 from faraday.server.utils.search import search
3838
299299 deserialization
300300 """
301301 return FlaskParser(unknown=EXCLUDE).parse(schema, request, location="json",
302 *args, **kwargs)
302 *args, **kwargs)
303303
304304 @classmethod
305305 def register(cls, app, *args, **kwargs):
306306 """Register and add JSON error handler. Use error code
307307 400 instead of 409"""
308 super(GenericView, cls).register(app, *args, **kwargs)
308 super().register(app, *args, **kwargs)
309309
310310 @app.errorhandler(422)
311 def handle_error(err): # pylint: disable=unused-variable
311 def handle_error(err): # pylint: disable=unused-variable
312312 # webargs attaches additional metadata to the `data` attribute
313313 exc = getattr(err, 'exc')
314314 if exc:
321321 }), 400
322322
323323 @app.errorhandler(409)
324 def handle_conflict(err): # pylint: disable=unused-variable
324 def handle_conflict(err): # pylint: disable=unused-variable
325325 # webargs attaches additional metadata to the `data` attribute
326326 exc = getattr(err, 'exc', None) or getattr(err, 'description', None)
327327 if exc:
332332 return flask.jsonify(messages), 409
333333
334334 @app.errorhandler(InvalidUsage)
335 def handle_invalid_usage(error): # pylint: disable=unused-variable
335 def handle_invalid_usage(error): # pylint: disable=unused-variable
336336 response = flask.jsonify(error.to_dict())
337337 response.status_code = error.status_code
338338 return response
339339
340340 # @app.errorhandler(404)
341 def handle_not_found(err): # pylint: disable=unused-variable
341 def handle_not_found(err): # pylint: disable=unused-variable
342342 response = {'success': False, 'message': err.description if faraday_server.debug else err.name}
343343 return flask.jsonify(response), 404
344344
345345 @app.errorhandler(500)
346 def handle_server_error(err): # pylint: disable=unused-variable
347 response = {'success': False, 'message': f"Exception: {err.original_exception}" if faraday_server.debug else 'Internal Server Error'}
346 def handle_server_error(err): # pylint: disable=unused-variable
347 response = {'success': False,
348 'message': f"Exception: {err.original_exception}" if faraday_server.debug else 'Internal Server Error'}
348349 return flask.jsonify(response), 500
349350
350351
398399 sup = super()
399400 if hasattr(sup, 'before_request'):
400401 sup.before_request(name, *args, **kwargs)
401 if (self._get_workspace(kwargs['workspace_name']).readonly and
402 flask.request.method not in ['GET', 'HEAD', 'OPTIONS']):
402 if (self._get_workspace(kwargs['workspace_name']).readonly
403 and flask.request.method not in ['GET', 'HEAD', 'OPTIONS']):
403404 flask.abort(403, "Altering a readonly workspace is not allowed")
404405
405406
583584
584585 try:
585586 per_page = int(flask.request.args[
586 self.per_page_parameter_name])
587 self.per_page_parameter_name])
587588 except (TypeError, ValueError):
588589 flask.abort(404, 'Invalid per_page value')
589590
609610 class FilterWorkspacedMixin(ListMixin):
610611 """Add filter endpoint for searching on any workspaced objects columns
611612 """
613
612614 @route('/filter')
613615 def filter(self, workspace_name):
614616 """
633635
634636 class PageMeta:
635637 total = 0
638
636639 pagination_metadata = PageMeta()
637640 pagination_metadata.total = count
638641 return self._envelope_list(filtered_objs, pagination_metadata)
639642
640643 def _generate_filter_query(self, filters, workspace, severity_count=False):
641644 filter_query = search(db.session,
642 self.model_class,
643 filters)
645 self.model_class,
646 filters)
644647
645648 filter_query = filter_query.filter(self.model_class.workspace == workspace)
646649
665668 if 'offset' in filters:
666669 offset = filters.pop('offset')
667670 if 'limit' in filters:
668 limit = filters.pop('limit') # we need to remove pagination, since
669
670 filter_query = self._generate_filter_query(
671 filters,
672 workspace,
673 severity_count=severity_count
674 )
671 limit = filters.pop('limit') # we need to remove pagination, since
672
673 try:
674 filter_query = self._generate_filter_query(
675 filters,
676 workspace,
677 severity_count=severity_count
678 )
679 except AttributeError as e:
680 flask.abort(400, e)
681
675682 count = filter_query.count()
676683 if limit:
677684 filter_query = filter_query.limit(limit)
680687 objs = self.schema_class(**marshmallow_params).dumps(filter_query.all())
681688 return json.loads(objs), count
682689 else:
683 filter_query = self._generate_filter_query(
684 filters,
685 workspace,
686 )
690 try:
691 filter_query = self._generate_filter_query(
692 filters,
693 workspace,
694 )
695 except AttributeError as e:
696 flask.abort(400, e)
687697 column_names = ['count'] + [field['field'] for field in filters.get('group_by', [])]
688698 rows = [list(zip(column_names, row)) for row in filter_query.all()]
689699 data = []
721731
722732 class PageMeta:
723733 total = 0
734
724735 pagination_metadata = PageMeta()
725736 pagination_metadata.total = count
726737 return self._envelope_list(filtered_objs, pagination_metadata)
727738
728739 def _generate_filter_query(self, filters, severity_count=False, host_vulns=False):
729740 filter_query = search(db.session,
730 self.model_class,
731 filters)
741 self.model_class,
742 filters)
732743
733744 if severity_count and 'group_by' not in filters:
734745 filter_query = count_vulnerability_severities(filter_query, self.model_class,
751762 if 'offset' in filters:
752763 offset = filters.pop('offset')
753764 if 'limit' in filters:
754 limit = filters.pop('limit') # we need to remove pagination, since
755
756 filter_query = self._generate_filter_query(
757 filters,
758 severity_count=severity_count,
759 host_vulns=host_vulns
760 )
765 limit = filters.pop('limit') # we need to remove pagination, since
766
767 try:
768 filter_query = self._generate_filter_query(
769 filters,
770 severity_count=severity_count,
771 host_vulns=host_vulns
772 )
773 except AttributeError as e:
774 flask.abort(400, e)
761775
762776 if extra_alchemy_filters is not None:
763777 filter_query = filter_query.filter(extra_alchemy_filters)
817831
818832 class RetrieveWorkspacedMixin(RetrieveMixin):
819833 """Add GET /<workspace_name>/<route_base>/<id>/ route"""
834
820835 # There are no differences with the non-workspaced implementations. The code
821836 # inside the view generic methods is enough
822837 def get(self, object_id, workspace_name=None):
911926 flask.request)
912927 data.pop('id', None)
913928 created = self._perform_create(data, **kwargs)
914 created.creator = g.user
929 if not flask_login.current_user.is_anonymous:
930 created.creator = flask_login.current_user
915931 db.session.commit()
916932 return self._dump(created, kwargs), 201
917933
960976 command_id = None
961977
962978 if command_id:
963 command = db.session.query(Command).filter(Command.id==command_id, Command.workspace==obj.workspace).first()
979 command = db.session.query(Command).filter(Command.id == command_id,
980 Command.workspace == obj.workspace).first()
964981 if command is None:
965982 raise InvalidUsage('Command not found.')
966983 # if the object is created and updated in the same command
11781195 self._perform_update(object_id, obj, data, partial=True, **kwargs)
11791196
11801197 return self._dump(obj, kwargs), 200
1198
11811199
11821200 class UpdateWorkspacedMixin(UpdateMixin, CommandMixin):
11831201 """Add PUT /<workspace_name>/<route_base>/<id>/ route
12741292
12751293 class DeleteMixin:
12761294 """Add DELETE /<id>/ route"""
1295
12771296 def delete(self, object_id, **kwargs):
12781297 """
12791298 ---
13001319
13011320 class DeleteWorkspacedMixin(DeleteMixin):
13021321 """Add DELETE /<workspace_name>/<route_base>/<id>/ route"""
1322
13031323 def delete(self, object_id, workspace_name=None):
1304
13051324 """
13061325 ---
13071326 tags: ["{tag_name}"]
13851404
13861405 count = self._filter_query(
13871406 db.session.query(self.model_class)
1388 .join(Workspace)
1389 .group_by(group_by)
1390 .filter(Workspace.name == workspace_name,
1391 *self.count_extra_filters))
1392
1393 #order
1407 .join(Workspace)
1408 .group_by(group_by)
1409 .filter(Workspace.name == workspace_name,
1410 *self.count_extra_filters))
1411
1412 # order
13941413 order_by = group_by
13951414 if sort_dir == 'desc':
13961415 count = count.order_by(desc(order_by))
14421461 400:
14431462 description: No workspace passed or group_by is not specified
14441463 """
1445 #"""head:
1464 # """head:
14461465 # tags: [{tag_name}]
14471466 # responses:
14481467 # 200:
14861505 grouped_attr = getattr(self.model_class, group_by)
14871506
14881507 q = db.session.query(
1489 Workspace.name,
1490 grouped_attr,
1491 func.count(grouped_attr)
1492 )\
1493 .join(Workspace)\
1494 .group_by(grouped_attr, Workspace.name)\
1508 Workspace.name,
1509 grouped_attr,
1510 func.count(grouped_attr)
1511 ) \
1512 .join(Workspace) \
1513 .group_by(grouped_attr, Workspace.name) \
14951514 .filter(Workspace.name.in_(workspace_names_list))
14961515
1497 #order
1516 # order
14981517 order_by = grouped_attr
14991518 if sort_dir == 'desc':
15001519 q = q.order_by(desc(Workspace.name), desc(order_by))
15401559 Model converter that automatically sets minimum length
15411560 validators to not blankable fields
15421561 """
1562
15431563 def _add_column_kwargs(self, kwargs, column):
15441564 super()._add_column_kwargs(kwargs, column)
15451565 if not column.info.get('allow_blank', True):
15661586 else:
15671587 dt = dt.astimezone(datetime.timezone.utc)
15681588 return dt.isoformat(*args, **kwargs)
1589
1590
15691591 fields.DateTime.SERIALIZATION_FUNCS['iso'] = old_isoformat
15701592
15711593
15851607 def __init__(self, *args, **kwargs):
15861608 super().__init__(*args, **kwargs)
15871609 self.unknown = EXCLUDE
1610
15881611
15891612 class FilterAlchemyModelConverter(ModelConverter):
15901613 """Use this to make all fields of a model not required.
33 See the file 'doc/LICENSE' for the license information
44
55 """
6
7 # I'm Py3
3333 agent_creation_api = Blueprint('agent_creation_api', __name__)
3434
3535 logger = logging.getLogger(__name__)
36
3637
3738 class ExecutorSchema(AutoSchema):
3839
132133 except NoResultFound:
133134 flask.abort(404, f"No such workspace: {workspace_name}")
134135
135 def _perform_create(self, data, **kwargs):
136 def _perform_create(self, data, **kwargs):
136137 token = data.pop('token')
137138 if not faraday_server.agent_registration_secret:
138139 # someone is trying to use the token, but no token was generated yet.
139140 abort(401, "Invalid Token")
140 if not pyotp.TOTP(faraday_server.agent_registration_secret).verify(token, valid_window=1):
141 if not pyotp.TOTP(faraday_server.agent_registration_secret,
142 interval=int(faraday_server.agent_token_expiration)
143 ).verify(token, valid_window=1):
141144 abort(401, "Invalid Token")
142145
143146 workspace_names = data.pop('workspaces')
160163 dict_["name"] for dict_ in workspace_names
161164 ]
162165
163
164166 workspaces = list(
165167 self._get_workspace(workspace_name)
166168 for workspace_name in workspace_names
1616 class AgentAuthTokenSchema(Schema):
1717 token = fields.String(required=True)
1818 expires_in = fields.Float(required=True)
19 total_duration = fields.Float(required=True)
1920
2021
2122 class AgentAuthTokenView(GenericView):
3940 200:
4041 description: Ok
4142 """
42 totp = pyotp.TOTP(faraday_server.agent_registration_secret)
43 totp = pyotp.TOTP(faraday_server.agent_registration_secret, interval=int(
44 faraday_server.agent_token_expiration))
4345 return AgentAuthTokenSchema().dump(
4446 {'token': totp.now(),
45 'expires_in': totp.interval - datetime.datetime.now().timestamp() % totp.interval})
47 'expires_in': totp.interval - datetime.datetime.now().timestamp() % totp.interval,
48 'total_duration': totp.interval})
4649
4750
4851 class AgentAuthTokenV3View(AgentAuthTokenView):
4952 route_prefix = '/v3'
5053 trailing_slash = False
5154
55
5256 AgentAuthTokenView.register(agent_auth_token_api)
5357 AgentAuthTokenV3View.register(agent_auth_token_api)
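
Both hunks above feed the configured expiration into pyotp as the TOTP interval; a minimal sketch of the generate/verify round trip (the secret and the 60-second interval are illustrative stand-ins for agent_registration_secret and agent_token_expiration):

    import pyotp

    secret = pyotp.random_base32()            # stand-in for agent_registration_secret
    totp = pyotp.TOTP(secret, interval=60)    # stand-in for agent_token_expiration

    token = totp.now()                        # what the token endpoint returns as 'token'
    # valid_window=1 also accepts the previous interval's token, the same
    # tolerance used when an agent registers with this token.
    assert totp.verify(token, valid_window=1)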
44 """
55 from __future__ import print_function
66 from __future__ import absolute_import
7 from builtins import range
87
98 import flask
10 from flask_login import current_user
11 from marshmallow import Schema, fields
129
1310 from werkzeug.local import LocalProxy
1411 from werkzeug.datastructures import MultiDict
1714 import logging
1815
1916 from flask import current_app as app
20 from flask import abort, Blueprint, jsonify, g, request, make_response
21 from flask_security.confirmable import requires_confirmation
22 from flask_security.forms import LoginForm, ChangePasswordForm
23 from flask_security.datastore import SQLAlchemyUserDatastore
24 from flask_security.utils import (
25 get_message,
26 get_identity_attributes,
27 )
28 from flask_security.signals import password_reset, reset_password_instructions_sent
17 from flask import Blueprint, request, make_response
18 from flask_security.signals import reset_password_instructions_sent
2919 from faraday.server import config
3020
3121 from flask_security.recoverable import generate_reset_password_token, update_password
3222 from flask_security.views import anonymous_user_required
33 from werkzeug.middleware.proxy_fix import ProxyFix
34 #from flask_security.recoverable import _security
35 from flask_security.utils import do_flash, send_mail, \
36 config_value, get_token_status, verify_hash
23 from flask_security.utils import send_mail, config_value, get_token_status, verify_hash
3724 from flask_security.forms import ResetPasswordForm
3825
3926 from faraday.server.models import User
27
4028 _security = LocalProxy(lambda: app.extensions['security'])
4129 _datastore = LocalProxy(lambda: _security.datastore)
4230
4432 logger = logging.getLogger(__name__)
4533
4634
47 @auth.route('/auth/forgot_password', methods= ['POST'])
35 @auth.route('/auth/forgot_password', methods=['POST'])
4836 @anonymous_user_required
4937 def forgot_password():
5038 """
5947
6048 if not config.smtp.is_enabled():
6149 logger.warning('Missing SMTP Config.')
62 return make_response(flask.jsonify(response=dict(message="Operation not implemented"), success=False, code=501), 501)
50 return make_response(flask.jsonify(response=dict(message="Operation not implemented"), success=False, code=501),
51 501)
6352
6453 if 'email' not in request.json:
65 return make_response(flask.jsonify(response=dict(message="Operation not allowed"), success=False, code=406),406)
66
54 return make_response(flask.jsonify(response=dict(message="Operation not allowed"), success=False, code=406),
55 406)
6756
6857 try:
6958 email = request.json.get('email')
7059 user = User.query.filter_by(email=email).first()
7160 if not user:
72 return make_response(flask.jsonify(response=dict(email=email, message="Invalid Email"), success=False, code=400), 400)
61 return make_response(
62 flask.jsonify(response=dict(email=email, message="Invalid Email"), success=False, code=400), 400)
7363
7464 send_reset_password_instructions(user)
7565 return flask.jsonify(response=dict(email=email), success=True, code=200)
7666 except Exception as e:
7767 logger.exception(e)
78 return make_response(flask.jsonify(response=dict(email=email, message="Server Error"), success=False, code=500), 500)
68 return make_response(flask.jsonify(response=dict(email=email, message="Server Error"), success=False, code=500),
69 500)
7970
8071
81 @auth.route('/auth/reset_password/<token>', methods= ['POST'])
72 @auth.route('/auth/reset_password/<token>', methods=['POST'])
8273 @anonymous_user_required
8374 def reset_password(token):
8475 """
9283 """
9384 if not config.smtp.is_enabled():
9485 logger.warning('Missing SMTP Config.')
95 return make_response(flask.jsonify(response=dict(message="Operation not implemented"), success=False, code=501), 501)
86 return make_response(flask.jsonify(response=dict(message="Operation not implemented"), success=False, code=501),
87 501)
9688
9789 try:
9890 if 'password' not in request.json or 'password_confirm' not in request.json:
99 return make_response(flask.jsonify(response=dict(message="Invalid data provided"), success=False, code=406),406)
91 return make_response(flask.jsonify(response=dict(message="Invalid data provided"), success=False, code=406),
92 406)
10093
10194 expired, invalid, user = reset_password_token_status(token)
10295
10497 invalid = True
10598
10699 if invalid or expired:
107 return make_response(flask.jsonify(response=dict(message="Invalid Token"), success=False, code=406),406)
100 return make_response(flask.jsonify(response=dict(message="Invalid Token"), success=False, code=406), 406)
108101 if request.is_json:
109102 form = ResetPasswordForm(MultiDict(request.get_json()))
110103 if form.validate_on_submit() and validate_strong_password(form.password.data, form.password_confirm.data):
112105 _datastore.commit()
113106 return flask.jsonify(response=dict(message="Password changed successfully"), success=True, code=200)
114107
115 return make_response(flask.jsonify(response=dict(message="Bad request"), success=False, code=400),400)
108 return make_response(flask.jsonify(response=dict(message="Bad request"), success=False, code=400), 400)
116109
117110 except Exception as e:
118111 logger.exception(e)
119 return make_response(flask.jsonify(response=dict(token=token, message="Server Error"), success=False, code=500), 500)
112 return make_response(flask.jsonify(response=dict(token=token, message="Server Error"), success=False, code=500),
113 500)
120114
121115
122116 def send_reset_password_instructions(user):
130124
131125 if config_value('SEND_PASSWORD_RESET_EMAIL'):
132126 send_mail(config_value('EMAIL_SUBJECT_PASSWORD_RESET'),
133 user.email, 'reset_instructions',
134 user=user, reset_link=reset_link)
127 user.email, 'reset_instructions',
128 user=user, reset_link=reset_link)
135129
136130 reset_password_instructions_sent.send(
137131 app._get_current_object(), user=user, token=token
144138 """
145139 if config_value('SEND_PASSWORD_RESET_NOTICE_EMAIL'):
146140 send_mail(config_value('EMAIL_SUBJECT_PASSWORD_NOTICE'),
147 user.email, 'reset_notice', user=user)
141 user.email, 'reset_notice', user=user)
148142
149143
150144 def reset_password_token_status(token):
11 from datetime import datetime, timedelta
22 from typing import Type, Optional
33
4
4 import time
5 import flask_login
56 import flask
67 import sqlalchemy
78 from sqlalchemy.orm.exc import NoResultFound
6768 class PolymorphicVulnerabilityField(fields.Field):
6869 """Used like a nested field with many objects, but it decides which
6970 schema to use based on the type of each vuln"""
71
7072 def __init__(self, *args, **kwargs):
7173 super().__init__(*args, **kwargs)
7274 self.many = kwargs.get('many', False)
222224 data: dict,
223225 data_already_deserialized: bool = False,
224226 set_end_date: bool = True):
227
228 logger.info("Init bulk create process")
229 start_time = time.time()
230
225231 if not data_already_deserialized:
226232 schema = BulkCreateSchema()
227233 data = schema.load(data)
234
228235 if 'command' in data:
229236 command = _update_command(command, data['command'])
230 for host in data['hosts']:
231 _create_host(ws, host, command)
237
238 total_hosts = len(data['hosts'])
239 if total_hosts > 0:
240 logger.debug(f"Needs to create {total_hosts} hosts...")
241 for host in data['hosts']:
242 _create_host(ws, host, command)
243
232244 if 'command' in data and set_end_date:
233245 command.end_date = datetime.now() if command.end_date is None else \
234246 command.end_date
235247 db.session.commit()
248
249 total_secs = time.time() - start_time
250 logger.info(f"Finish bulk create process. Total time: {total_secs:.2f} secs")
236251
237252
238253 def _update_command(command: Command, command_data: dict):
254269 if command is not None:
255270 _create_command_object_for(ws, created, host, command)
256271
257 for service_data in services:
258 _create_service(ws, host, service_data, command)
259
260 for vuln_data in vulns:
261 _create_hostvuln(ws, host, vuln_data, command)
262
263 for cred_data in credentials:
264 _create_credential(ws, cred_data, command, host=host)
272 total_services = len(services)
273 if total_services > 0:
274 logger.debug(f"Needs to create {total_services} services...")
275 for service_data in services:
276 _create_service(ws, host, service_data, command)
277
278 total_vulns = len(vulns)
279 if total_vulns > 0:
280 logger.debug(f"Needs to create {total_vulns} vulns...")
281 for vuln_data in vulns:
282 _create_hostvuln(ws, host, vuln_data, command)
283
284 total_credentials = len(credentials)
285 if total_credentials > 0:
286 logger.debug(f"Needs to create {total_credentials} credentials...")
287 for cred_data in credentials:
288 _create_credential(ws, cred_data, command, host=host)
265289
266290
267291 def _create_command_object_for(ws, created, obj, command):
310334 if command is not None:
311335 _create_command_object_for(ws, created, service, command)
312336
313 for vuln_data in vulns:
314 _create_servicevuln(ws, service, vuln_data, command)
315
316 for cred_data in creds:
317 _create_credential(ws, cred_data, command, service=service)
337 total_service_vulns = len(vulns)
338 if total_service_vulns > 0:
339 logger.debug(f"Needs to create {total_service_vulns} service vulns...")
340 for vuln_data in vulns:
341 _create_servicevuln(ws, service, vuln_data, command)
342
343 total_service_creds = len(creds)
344 if total_service_creds > 0:
345 logger.debug(f"Needs to create {total_service_creds} service credentials...")
346 for cred_data in creds:
347 _create_credential(ws, cred_data, command, service=service)
318348
319349
320350 def _create_vuln(ws, vuln_data, command=None, **kwargs):
432462 """
433463 data = self._parse_data(self._get_schema_instance({}), flask.request)
434464
435 if flask.g.user is None:
465 if flask_login.current_user.is_anonymous:
436466 agent = require_agent_token()
437467 workspace = self._get_workspace(workspace_name)
438468
472502
473503 data["command"] = {
474504 'id': agent_execution.command.id,
475 'tool': agent.name, # Agent name
505 'tool': agent.name, # Agent name
476506 'command': agent_execution.executor.name,
477507 'user': '',
478508 'hostname': '',
494524 _update_command(command, data['command'])
495525 db.session.flush()
496526
497
498527 else:
499528 workspace = self._get_workspace(workspace_name)
500 creator_user = flask.g.user
529 creator_user = flask_login.current_user
501530 data = add_creator(data, creator_user)
502531
503532 if 'command' in data:
4040 if obj.end_date:
4141 return (obj.end_date - obj.start_date).seconds + ((obj.end_date - obj.start_date).microseconds / 1000000.0)
4242 else:
43 if (datetime.datetime.now() - obj.start_date).total_seconds() > 86400:# 86400 is 1d TODO BY CONFIG
43 if (datetime.datetime.now() - obj.start_date).total_seconds() > 86400: # 86400 is 1d TODO BY CONFIG
4444 return 'Timeout'
4545 return 'In progress'
4646
119119 200:
120120 description: Last executed command or an empty json
121121 """
122 command = Command.query.join(Workspace).filter_by(name=workspace_name).order_by(Command.start_date.desc()).first()
122 command = Command.query.join(Workspace).filter_by(name=workspace_name).order_by(
123 Command.start_date.desc()).first()
123124 command_obj = {}
124125 if command:
125126 command_obj = {
0
10 import logging
21 from io import BytesIO
3 from lxml.etree import Element, SubElement, tostring # nosec
2 from lxml.etree import Element, SubElement, tostring # nosec
43 # We don't use Element for parsing
54 from flask import Blueprint, request, abort, send_file
65
8180 web_services.add(vuln_web.service)
8281 web_vuln_tag = SubElement(web_vulns_tag, 'web_vuln')
8382 _build_vuln_web_element(vuln_web, web_vuln_tag)
84
8583
8684 for vuln in host.vulnerabilities:
8785 vuln_tag = SubElement(vulns_tag, 'vuln')
1010 exploits_api = Blueprint('exploits_api', __name__)
1111
1212 logger = logging.getLogger(__name__)
13
1314
1415 @gzipped
1516 @exploits_api.route('/v2/vulners/exploits/<cveid>', methods=['GET'])
1414 'messages': 'error',
1515 }), 500
1616
17
18
19
20 #.register(commandsrun_api)
21 # I'm Py3
17 # .register(commandsrun_api)
132132 operators = (operators.Equal, operators.Like, operators.ILike)
133133 service = ServiceNameFilter(fields.Str())
134134 port = ServicePortFilter(fields.Str())
135
136135
137136
138137 class HostsView(PaginatedMixin,
378377 Service.name.ilike(like_term))
379378 match_os = Host.os.ilike(like_term)
380379 match_hostname = Host.hostnames.any(Hostname.name.ilike(like_term))
381 query = query.filter(match_ip |
382 match_service_name |
383 match_os |
384 match_hostname)
380 query = query.filter(match_ip
381 | match_service_name
382 | match_os
383 | match_hostname)
385384 return query
386385
387386 def _envelope_list(self, objects, pagination_metadata=None):
464463 bulk_create.__doc__ = HostsView.bulk_create.__doc__
465464 count_vulns.__doc__ = HostsView.count_vulns.__doc__
466465
466
467467 HostsView.register(host_api)
468468 HostsV3View.register(host_api)
99
1010
1111 info_api = Blueprint('info_api', __name__)
12
1213
1314 @info_api.route('/v2/info', methods=['GET'])
1415 def show_info():
00 from faraday.server.api.base import GenericView
11 from faraday.server.models import User, db
2 from flask import Blueprint, request, jsonify, g, abort
2 from flask import Blueprint, request, jsonify, abort
33 from marshmallow import Schema, fields
4 import flask_login
45
56 preferences_api = Blueprint('preferences_api', __name__)
67
2425 200:
2526 description: Ok
2627 """
27 user = g.user
28 user = flask_login.current_user
2829
2930 if request.json and 'preferences' not in request.json:
3031 abort(400)
4647 200:
4748 description: Ok
4849 """
49 return jsonify({'preferences': g.user.preferences}), 200
50 return jsonify({'preferences': flask_login.current_user.preferences}), 200
5051
5152
5253 class PreferencesV3View(PreferencesView):
00 # Faraday Penetration Test IDE
11 # Copyright (C) 2016 Infobyte LLC (http://www.infobytesec.com/)
22 # See the file 'doc/LICENSE' for the license information
3 from flask import Blueprint, g
3 from flask import Blueprint
44 from marshmallow import fields
5 import flask_login
56
67 from faraday.server.models import SearchFilter
78 from faraday.server.api.base import (
3031
3132 def _get_base_query(self):
3233 query = super()._get_base_query()
33 return query.filter(SearchFilter.creator_id == g.user.id)
34 return query.filter(SearchFilter.creator_id == flask_login.current_user.id)
3435
3536
3637 class SearchFilterV3View(SearchFilterView, PatchableMixin):
138138 route_prefix = '/v3/ws/<workspace_name>/'
139139 trailing_slash = False
140140
141
141142 ServiceView.register(services_api)
142143 ServiceV3View.register(services_api)
66 from flask import jsonify, Blueprint
77 from flask_wtf.csrf import generate_csrf
88 from faraday.server.api.base import get_user_permissions
9 import flask
9 import flask_login
1010
1111 session_api = Blueprint('session_api', __name__)
12
1213
1314 @session_api.route('/session')
1415 def session_info():
2122 200:
2223 description: Ok
2324 """
24 user = flask.g.user
25 user = flask_login.current_user
2526 data = user.get_security_payload()
2627 data['csrf_token'] = generate_csrf()
2728 data['preferences'] = user.preferences
11 import logging
22
33 from itsdangerous import TimedJSONWebSignatureSerializer
4 from flask import Blueprint, g, request
4 from flask import Blueprint, request
55 from flask_security.utils import hash_data
66 from flask import current_app as app
77 from marshmallow import Schema
8 import flask_login
89
910 from faraday.server.config import faraday_server
1011 from faraday.server.api.base import GenericView
3233 200:
3334 description: Ok
3435 """
35 user_id = g.user.id
36 user_id = flask_login.current_user.fs_uniquifier
3637 serializer = TimedJSONWebSignatureSerializer(
3738 app.config['SECRET_KEY'],
3839 salt="api_token",
3940 expires_in=int(faraday_server.api_token_expiration)
4041 )
41 hashed_data = hash_data(g.user.password) if g.user.password else None
42 hashed_data = hash_data(flask_login.current_user.password) if flask_login.current_user.password else None
4243 user_ip = request.headers.get('X-Forwarded-For', request.remote_addr)
4344 requested_at = datetime.datetime.now()
44 audit_logger.info(f"User [{g.user.username}] requested token from IP [{user_ip}] at [{requested_at}]")
45 audit_logger.info(f"User [{flask_login.current_user.username}] requested token from IP [{user_ip}] at [{requested_at}]")
4546 return serializer.dumps({'user_id': user_id, "validation_check": hashed_data}).decode('utf-8')
4647
4748
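
A sketch of the token round trip implied by the view above; SECRET, the payload values and the one-hour expiry are illustrative (the real view signs flask_login.current_user.fs_uniquifier with the app's SECRET_KEY and the "api_token" salt):

    from itsdangerous import TimedJSONWebSignatureSerializer

    SECRET = "change-me"  # illustrative; the view uses app.config['SECRET_KEY']
    serializer = TimedJSONWebSignatureSerializer(SECRET, salt="api_token", expires_in=3600)

    token = serializer.dumps({'user_id': 'abc123', 'validation_check': None}).decode('utf-8')
    # loads() raises SignatureExpired / BadSignature when the token is stale or tampered with.
    payload = serializer.loads(token)
    assert payload['user_id'] == 'abc123'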
44 import random
55 import logging
66 from datetime import datetime
7 import flask_login
78
89 from faraday.server.config import CONST_FARADAY_HOME_PATH
910 from faraday.server.threads.reports_processor import REPORTS_QUEUE
1415 jsonify,
1516 Blueprint,
1617 )
17 import flask
1818
1919 from flask_wtf.csrf import validate_csrf
2020 from werkzeug.utils import secure_filename
7474 if report_file:
7575
7676 chars = string.ascii_uppercase + string.digits
77 random_prefix = ''.join(random.choice(chars) for x in range(12)) # nosec
77 random_prefix = ''.join(random.choice(chars) for x in range(12)) # nosec
7878 raw_report_filename = f'{random_prefix}_{secure_filename(report_file.filename)}'
7979
8080 try:
8585 except AttributeError:
8686 logger.warning(
8787 "Upload reports in WEB-UI not configurated, run Faraday client and try again...")
88 abort(make_response(jsonify(message="Upload reports not configurated: Run faraday client and start Faraday server again"), 500))
88 abort(make_response(
89 jsonify(message="Upload reports not configurated: Run faraday client and start Faraday server again"),
90 500))
8991 else:
9092 logger.info(f"Get plugin for file: {file_path}")
9193 plugin = report_analyzer.get_plugin(file_path)
115117 command.id,
116118 file_path,
117119 plugin.id,
118 flask.g.user.id
120 flask_login.current_user.id
119121 )
120122 )
121123 return make_response(
2020 from marshmallow.validate import OneOf
2121 import wtforms
2222
23
2423 from faraday.server.api.base import (
2524 AutoSchema,
2625 FilterAlchemyMixin,
6059 _id = fields.Integer(dump_only=True, attribute='id')
6160 id = fields.Integer(dump_only=True, attribute='id')
6261 _rev = fields.String(default='', dump_only=True)
63 cwe = fields.String(dump_only=True, default='') # deprecated field, the legacy data is added to refs on import
62 cwe = fields.String(dump_only=True, default='') # deprecated field, the legacy data is added to refs on import
6463 exploitation = SeverityField(attribute='severity', required=True)
6564 references = fields.Method('get_references', deserialize='load_references')
6665 refs = fields.List(fields.String(), dump_only=True, attribute='references')
7776 creator_id = fields.Integer(dump_only=True, attribute='creator_id')
7877
7978 create_at = fields.DateTime(attribute='create_date',
80 dump_only=True)
79 dump_only=True)
8180
8281 # Here we use vulnerability instead of vulnerability_template to avoid duplicate row
8382 # in the custom_fields_schema table.
284283 status_code
285284 )
286285
287
288286 def _parse_vuln_from_file(self, vulns_reader):
289287 custom_fields = {cf_schema.field_name: cf_schema for cf_schema in db.session.query(CustomFieldsSchema).all()}
290288 vulns_list = []
123123 policyviolations = fields.List(fields.String,
124124 attribute='policy_violations')
125125 refs = fields.List(fields.String(), attribute='references')
126 owasp = fields.Method(serialize='get_owasp_refs', default=[])
127 cve = fields.Method(serialize='get_cve_refs', default=[])
128 cwe = fields.Method(serialize='get_cwe_refs', default=[])
129 cvss = fields.Method(serialize='get_cvss_refs', default=[])
126130 issuetracker = fields.Method(serialize='get_issuetracker', dump_only=True)
127131 tool = fields.String(attribute='tool')
128132 parent = fields.Method(serialize='get_parent', deserialize='load_parent', required=True)
168172 'service', 'obj_id', 'type', 'policyviolations',
169173 '_attachments',
170174 'target', 'host_os', 'resolution', 'metadata',
171 'custom_fields', 'external_id', 'tool')
175 'custom_fields', 'external_id', 'tool',
176 'cvss', 'cwe', 'cve', 'owasp',
177 )
172178
173179 def get_type(self, obj):
174180 return obj.__class__.__name__
181
182 def get_owasp_refs(self, obj):
183 return [reference for reference in obj.references if 'owasp' in reference.lower()]
184
185 def get_cwe_refs(self, obj):
186 return [reference for reference in obj.references if 'cwe' in reference.lower()]
187
188 def get_cve_refs(self, obj):
189 return [reference for reference in obj.references if 'cve' in reference.lower()]
190
191 def get_cvss_refs(self, obj):
192 return [reference for reference in obj.references if 'cvss' in reference.lower()]
175193
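
A quick illustration of what the substring matching in the new get_*_refs methods yields (the reference strings are made up):

    refs = ["CVE-2021-44228", "OWASP-A03:2021", "CWE-89", "https://example.com/advisory"]

    cve   = [r for r in refs if 'cve'   in r.lower()]   # ['CVE-2021-44228']
    owasp = [r for r in refs if 'owasp' in r.lower()]   # ['OWASP-A03:2021']
    cwe   = [r for r in refs if 'cwe'   in r.lower()]   # ['CWE-89']
    cvss  = [r for r in refs if 'cvss'  in r.lower()]   # []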
176194 def get_attachments(self, obj):
177195 res = {}
266284
267285
268286 class VulnerabilityWebSchema(VulnerabilitySchema):
269
270287 method = fields.String(default='')
271288 params = fields.String(attribute='parameters', default='')
272289 pname = fields.String(attribute='parameter_name', default='')
289306 'service', 'obj_id', 'type', 'policyviolations',
290307 'request', '_attachments', 'params',
291308 'target', 'host_os', 'resolution', 'method', 'metadata',
292 'status_code', 'custom_fields', 'external_id', 'tool'
309 'status_code', 'custom_fields', 'external_id', 'tool',
310 'cve', 'cwe', 'owasp', 'cvss',
293311 )
294312
295313
335353 return query.join(
336354 alias,
337355 alias.id == model.__table__.c.service_id).filter(
338 alias.name == value
356 alias.name == value
339357 )
340358
341359
345363
346364 value_list = value.split(",")
347365
348 service_hostnames_query = query.join(Service, Service.id == Vulnerability.service_id).\
349 join(Host).\
350 join(alias).\
351 filter(alias.name.in_(value_list))
352
353 host_hostnames_query = query.join(Host, Host.id == Vulnerability.host_id).\
354 join(alias).\
366 service_hostnames_query = query.join(Service, Service.id == Vulnerability.service_id). \
367 join(Host). \
368 join(alias). \
369 filter(alias.name.in_(value_list))
370
371 host_hostnames_query = query.join(Host, Host.id == Vulnerability.host_id). \
372 join(alias). \
355373 filter(alias.name.in_(value_list))
356374
357375 query = service_hostnames_query.union(host_hostnames_query)
395413 field: _strict_filtering for field in strict_fields
396414 }
397415 operators = (CustomILike, operators.Equal)
416
398417 id = IDFilter(fields.Int())
399418 target = TargetFilter(fields.Str())
400419 type = TypeFilter(fields.Str(validate=[OneOf(['Vulnerability',
442461
443462 if command_id:
444463 # query = query.filter(CommandObject.command_id == int(command_id))
445 query = query.filter(VulnerabilityGeneric.creator_command_id ==
446 int(command_id)) # TODO migration: handle invalid int()
464 query = query.filter(VulnerabilityGeneric.creator_command_id
465 == int(command_id)) # TODO migration: handle invalid int()
447466 return query
448467
449468
451470 FilterAlchemyMixin,
452471 ReadWriteWorkspacedView,
453472 CountMultiWorkspacedMixin):
454
455473 route_base = 'vulns'
456474 filterset_class = VulnerabilityFilterSet
457475 sort_model_class = VulnerabilityWeb # It has all the fields
534552 )
535553
536554 def _update_object(self, obj, data, **kwargs):
537 data.pop('type', '') # It's forbidden to change vuln type!
555 data.pop('type', '') # It's forbidden to change vuln type!
538556 data.pop('tool', '')
539557 return super()._update_object(obj, data)
540558
557575 *args, **kwargs)
558576 joinedloads = [
559577 joinedload(Vulnerability.host)
560 .load_only(Host.id) # Only hostnames are needed
561 .joinedload(Host.hostnames),
578 .load_only(Host.id) # Only hostnames are needed
579 .joinedload(Host.hostnames),
562580
563581 joinedload(Vulnerability.service)
564 .joinedload(Service.host)
565 .joinedload(Host.hostnames),
582 .joinedload(Service.host)
583 .joinedload(Host.hostnames),
566584
567585 joinedload(VulnerabilityWeb.service)
568 .joinedload(Service.host)
569 .joinedload(Host.hostnames),
586 .joinedload(Service.host)
587 .joinedload(Host.hostnames),
570588 joinedload(VulnerabilityGeneric.update_user),
571589 undefer(VulnerabilityGeneric.creator_command_id),
572590 undefer(VulnerabilityGeneric.creator_command_tool),
684702 flask.abort(403)
685703 vuln_workspace_check = db.session.query(VulnerabilityGeneric, Workspace.id).join(
686704 Workspace).filter(VulnerabilityGeneric.id == vuln_id,
687 Workspace.name == workspace_name).first()
705 Workspace.name == workspace_name).first()
688706
689707 if vuln_workspace_check:
690708 if 'file' not in request.files:
776794 return res_filters, hostname_filters
777795
778796 def _generate_filter_query(self, vulnerability_class, filters, hostname_filters, workspace, marshmallow_params):
779 hosts_os_filter = [host_os_filter for host_os_filter in filters.get('filters', []) if host_os_filter.get('name') == 'host__os']
797 hosts_os_filter = [host_os_filter for host_os_filter in filters.get('filters', []) if
798 host_os_filter.get('name') == 'host__os']
780799
781800 if hosts_os_filter:
782801 # remove host__os filters from filters due to a bug
783802 hosts_os_filter = hosts_os_filter[0]
784 filters['filters'] = [host_os_filter for host_os_filter in filters.get('filters', []) if host_os_filter.get('name') != 'host__os']
803 filters['filters'] = [host_os_filter for host_os_filter in filters.get('filters', []) if
804 host_os_filter.get('name') != 'host__os']
785805
786806 vulns = search(db.session,
787807 vulnerability_class,
788808 filters)
789 vulns = vulns.filter(VulnerabilityGeneric.workspace==workspace)
790
809 vulns = vulns.filter(VulnerabilityGeneric.workspace == workspace)
791810 if hostname_filters:
792811 or_filters = []
793812 for hostname_filter in hostname_filters:
799818
800819 if hosts_os_filter:
801820 os_value = hosts_os_filter['val']
802 vulns = vulns.join(Host).join(Service).filter(Host.os==os_value)
821 vulns = vulns.join(Host).join(Service).filter(Host.os == os_value)
803822
804823 if 'group_by' not in filters:
805824 vulns = vulns.options(
815834 filters = FlaskRestlessSchema().load(json.loads(filters)) or {}
816835 hostname_filters = []
817836 if filters:
818 _, hostname_filters = self._hostname_filters(filters.get('filters', []))
837 filters['filters'], hostname_filters = self._hostname_filters(filters.get('filters', []))
819838 except (ValidationError, JSONDecodeError) as ex:
820839 logger.exception(ex)
821840 flask.abort(400, "Invalid filters")
828847 if 'offset' in filters:
829848 offset = filters.pop('offset')
830849 if 'limit' in filters:
831 limit = filters.pop('limit') # we need to remove pagination, since
832
833 vulns = self._generate_filter_query(
834 VulnerabilityGeneric,
835 filters,
836 hostname_filters,
837 workspace,
838 marshmallow_params)
850 limit = filters.pop('limit') # we need to remove pagination, since
851
852 try:
853 vulns = self._generate_filter_query(
854 VulnerabilityGeneric,
855 filters,
856 hostname_filters,
857 workspace,
858 marshmallow_params)
859 except AttributeError as e:
860 flask.abort(400, e)
839861 total_vulns = vulns
840862 if limit:
841863 vulns = vulns.limit(limit)
853875 workspace,
854876 marshmallow_params,
855877 )
856 column_names = ['count'] + [field['field'] for field in filters.get('group_by',[])]
878 column_names = ['count'] + [field['field'] for field in filters.get('group_by', [])]
857879 rows = [list(zip(column_names, row)) for row in vulns.all()]
858880 vulns_data = []
859881 for row in rows:
860 vulns_data.append({field[0]:field[1] for field in row})
882 vulns_data.append({field[0]: field[1] for field in row})
861883
862884 return vulns_data, len(rows)
863885
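When group_by is present, the endpoint above zips the selected column names with each aggregated row and returns a list of dicts. A tiny self-contained sketch of that step; the column names and values here are made up.

column_names = ['count', 'severity']
rows = [(3, 'high'), (5, 'medium')]
# same result as the {field[0]: field[1] for field in ...} comprehension above
vulns_data = [dict(zip(column_names, row)) for row in rows]
print(vulns_data)  # [{'count': 3, 'severity': 'high'}, {'count': 5, 'severity': 'medium'}]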
882904
883905 if vuln_workspace_check:
884906 file_obj = db.session.query(File).filter_by(object_type='vulnerability',
885 object_id=vuln_id,
886 filename=attachment_filename.replace(" ", "%20")).first()
907 object_id=vuln_id,
908 filename=attachment_filename.replace(" ", "%20")).first()
887909 if file_obj:
888910 depot = DepotManager.get()
889911 depot_file = depot.get(file_obj.content.get('file_id'))
932954 Workspace.name == workspace.name).first()
933955 if vuln_workspace_check:
934956 files = db.session.query(File).filter_by(object_type='vulnerability',
935 object_id=vuln_id).all()
957 object_id=vuln_id).all()
936958 res = {}
937959 for file_obj in files:
938960 ret = EvidenceSchema().dump(file_obj)
941963 return flask.jsonify(res)
942964 else:
943965 flask.abort(404, "Vulnerability not found")
944
945966
946967 @route('/<int:vuln_id>/attachment/<attachment_filename>/', methods=['DELETE'])
947968 def delete_attachment(self, workspace_name, vuln_id, attachment_filename):
10101031 as_attachment=True,
10111032 cache_timeout=-1)
10121033
1013
10141034 @route('bulk_delete/', methods=['DELETE'])
10151035 def bulk_delete(self, workspace_name):
10161036 """
10391059 if vulnerability_ids:
10401060 logger.info("Delete Vuln IDs: %s", vulnerability_ids)
10411061 vulns = VulnerabilityGeneric.query.filter(VulnerabilityGeneric.id.in_(vulnerability_ids),
1042 VulnerabilityGeneric.workspace_id == workspace.id)
1062 VulnerabilityGeneric.workspace_id == workspace.id)
10431063 elif vulnerability_severities:
10441064 logger.info("Delete Vuln Severities: %s", vulnerability_severities)
10451065 vulns = VulnerabilityGeneric.query.filter(VulnerabilityGeneric.severity.in_(vulnerability_severities),
10711091 """
10721092 limit = flask.request.args.get('limit', 1)
10731093 workspace = self._get_workspace(workspace_name)
1074 data = db.session.query(User, func.count(VulnerabilityGeneric.id)).join(VulnerabilityGeneric.creator)\
1075 .filter(VulnerabilityGeneric.workspace_id == workspace.id).group_by(User.id)\
1094 data = db.session.query(User, func.count(VulnerabilityGeneric.id)).join(VulnerabilityGeneric.creator) \
1095 .filter(VulnerabilityGeneric.workspace_id == workspace.id).group_by(User.id) \
10761096 .order_by(desc(func.count(VulnerabilityGeneric.id))).limit(int(limit)).all()
10771097 users = []
10781098 for item in data:
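The top-users query above boils down to a count-per-creator aggregation ordered by that count, taking the top N. A self-contained sketch of the equivalent SQL using the stdlib sqlite3 module; the table layout and sample data are made up, not Faraday's schema.

import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute("CREATE TABLE vulnerability (id INTEGER PRIMARY KEY, creator TEXT)")
conn.executemany("INSERT INTO vulnerability (creator) VALUES (?)",
                 [('alice',), ('alice',), ('alice',), ('bob',)])
top = conn.execute("""
    SELECT creator, COUNT(id) AS vuln_count
    FROM vulnerability
    GROUP BY creator
    ORDER BY vuln_count DESC
    LIMIT 1
""").fetchall()
print(top)  # [('alice', 3)]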
44 import flask
55 from flask import Blueprint
66 from flask import current_app as app
7 from flask_classful import route
78 from itsdangerous import BadData, TimestampSigner
89 from marshmallow import Schema
910 from sqlalchemy.orm.exc import NoResultFound
2425 route_base = 'websocket_token'
2526 schema_class = WebsocketWorkspaceAuthSchema
2627
27 def post(self, workspace_name):
28 @route('/', methods=['GET', 'POST'])
29 def get(self, workspace_name):
2830 """
2931 ---
30 post:
32 get:
3133 tags: ["Token"]
3234 responses:
3335 200:
4244 class WebsocketWorkspaceAuthV3View(WebsocketWorkspaceAuthView):
4345 route_prefix = "/v3/ws/<workspace_name>/"
4446 trailing_slash = False
47
48 @route('', methods=['GET', 'POST'])
49 def get(self, workspace_name):
50 """
51 ---
52 get:
53 tags: ["Token"]
54 responses:
55 200:
56 description: Ok
57 """
58 return super().get(workspace_name)
4559
4660
4761 WebsocketWorkspaceAuthView.register(websocket_auth_api)
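The websocket token endpoint above moves from a bare post() method to a handler registered for both GET and POST through flask_classful's route decorator. A hedged sketch of that registration pattern; the view class, route_base and method name are illustrative (Faraday's view decorates get() directly on its own base view).

from flask import Flask
from flask_classful import FlaskView, route

class TokenView(FlaskView):
    route_base = 'token'

    @route('/', methods=['GET', 'POST'])
    def issue_token(self):
        # both GET /token/ and POST /token/ reach this handler
        return {'token': 'example'}

app = Flask(__name__)
TokenView.register(app)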
2828 logger = logging.getLogger(__name__)
2929
3030 workspace_api = Blueprint('workspace_api', __name__)
31
3231
3332
3433 class WorkspaceSummarySchema(Schema):
6665 class WorkspaceSchema(AutoSchema):
6766
6867 name = fields.String(required=True,
69 validate=validate.Regexp(r"^[a-z0-9][a-z0-9\_\$\(\)\+\-\/]*$", 0,
68 validate=validate.Regexp(r"^[a-z0-9][a-z0-9\_\$\(\)\+\-]*$", 0,
7069 error="The workspace name must validate with the regex "
7170 "^[a-z0-9][a-z0-9\\_\\$\\(\\)\\+\\-\\/]*$"))
7271 stats = SelfNestedField(WorkspaceSummarySchema())
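The workspace-name validation above drops '/' from the set of allowed characters. A quick check of the new pattern with plain re; the sample names are made up.

import re

# the updated pattern from the schema above, without the '/' that was removed
pattern = re.compile(r"^[a-z0-9][a-z0-9\_\$\(\)\+\-]*$")
print(bool(pattern.match("my-workspace_1")))  # True
print(bool(pattern.match("team/project")))    # False: '/' is no longer accepted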
8584 dump_only=True)
8685
8786 active_agents_count = fields.Integer(dump_only=True)
88
8987
9088 class Meta:
9189 model = Workspace
44 import string
55 import datetime
66
7 import bleach
78 import pyotp
89 import requests
910 from flask_limiter import Limiter
1617 from configparser import ConfigParser, NoSectionError, NoOptionError, DuplicateSectionError
1718
1819 import flask
20 import flask_login
1921 from flask import Flask, session, g, request
2022 from flask.json import JSONEncoder
2123 from flask_sqlalchemy import get_debug_queries
4345 from faraday.server.utils.logger import LOGGING_HANDLERS
4446 from faraday.server.utils.invalid_chars import remove_null_caracters
4547 from faraday.server.config import CONST_FARADAY_HOME_PATH
46
4748
4849 logger = logging.getLogger(__name__)
4950 audit_logger = logging.getLogger('audit')
6869
6970
7071 def register_blueprints(app):
71
72 from faraday.server.api.modules.info import info_api # pylint:disable=import-outside-toplevel
73 from faraday.server.api.modules.commandsrun import commandsrun_api # pylint:disable=import-outside-toplevel
74 from faraday.server.api.modules.activity_feed import activityfeed_api # pylint:disable=import-outside-toplevel
75 from faraday.server.api.modules.credentials import credentials_api # pylint:disable=import-outside-toplevel
76 from faraday.server.api.modules.hosts import host_api # pylint:disable=import-outside-toplevel
77 from faraday.server.api.modules.licenses import license_api # pylint:disable=import-outside-toplevel
78 from faraday.server.api.modules.services import services_api # pylint:disable=import-outside-toplevel
79 from faraday.server.api.modules.session import session_api # pylint:disable=import-outside-toplevel
80 from faraday.server.api.modules.vulns import vulns_api # pylint:disable=import-outside-toplevel
81 from faraday.server.api.modules.vulnerability_template import vulnerability_template_api # pylint:disable=import-outside-toplevel
82 from faraday.server.api.modules.workspaces import workspace_api # pylint:disable=import-outside-toplevel
83 from faraday.server.api.modules.handlers import handlers_api # pylint:disable=import-outside-toplevel
84 from faraday.server.api.modules.comments import comment_api # pylint:disable=import-outside-toplevel
85 from faraday.server.api.modules.upload_reports import upload_api # pylint:disable=import-outside-toplevel
86 from faraday.server.api.modules.websocket_auth import websocket_auth_api # pylint:disable=import-outside-toplevel
87 from faraday.server.api.modules.get_exploits import exploits_api # pylint:disable=import-outside-toplevel
88 from faraday.server.api.modules.custom_fields import custom_fields_schema_api # pylint:disable=import-outside-toplevel
89 from faraday.server.api.modules.agent_auth_token import agent_auth_token_api # pylint:disable=import-outside-toplevel
90 from faraday.server.api.modules.agent import agent_api # pylint:disable=import-outside-toplevel
91 from faraday.server.api.modules.bulk_create import bulk_create_api # pylint:disable=import-outside-toplevel
92 from faraday.server.api.modules.token import token_api # pylint:disable=import-outside-toplevel
93 from faraday.server.api.modules.search_filter import searchfilter_api # pylint:disable=import-outside-toplevel
72 from faraday.server.api.modules.info import info_api # pylint:disable=import-outside-toplevel
73 from faraday.server.api.modules.commandsrun import commandsrun_api # pylint:disable=import-outside-toplevel
74 from faraday.server.api.modules.activity_feed import activityfeed_api # pylint:disable=import-outside-toplevel
75 from faraday.server.api.modules.credentials import credentials_api # pylint:disable=import-outside-toplevel
76 from faraday.server.api.modules.hosts import host_api # pylint:disable=import-outside-toplevel
77 from faraday.server.api.modules.licenses import license_api # pylint:disable=import-outside-toplevel
78 from faraday.server.api.modules.services import services_api # pylint:disable=import-outside-toplevel
79 from faraday.server.api.modules.session import session_api # pylint:disable=import-outside-toplevel
80 from faraday.server.api.modules.vulns import vulns_api # pylint:disable=import-outside-toplevel
81 from faraday.server.api.modules.vulnerability_template import \
82 vulnerability_template_api # pylint:disable=import-outside-toplevel
83 from faraday.server.api.modules.workspaces import workspace_api # pylint:disable=import-outside-toplevel
84 from faraday.server.api.modules.handlers import handlers_api # pylint:disable=import-outside-toplevel
85 from faraday.server.api.modules.comments import comment_api # pylint:disable=import-outside-toplevel
86 from faraday.server.api.modules.upload_reports import upload_api # pylint:disable=import-outside-toplevel
87 from faraday.server.api.modules.websocket_auth import websocket_auth_api # pylint:disable=import-outside-toplevel
88 from faraday.server.api.modules.get_exploits import exploits_api # pylint:disable=import-outside-toplevel
89 from faraday.server.api.modules.custom_fields import \
90 custom_fields_schema_api # pylint:disable=import-outside-toplevel
91 from faraday.server.api.modules.agent_auth_token import \
92 agent_auth_token_api # pylint:disable=import-outside-toplevel
93 from faraday.server.api.modules.agent import agent_api # pylint:disable=import-outside-toplevel
94 from faraday.server.api.modules.bulk_create import bulk_create_api # pylint:disable=import-outside-toplevel
95 from faraday.server.api.modules.token import token_api # pylint:disable=import-outside-toplevel
96 from faraday.server.api.modules.search_filter import searchfilter_api # pylint:disable=import-outside-toplevel
9497 from faraday.server.api.modules.preferences import preferences_api # pylint:disable=import-outside-toplevel
9598 from faraday.server.api.modules.export_data import export_data_api # pylint:disable=import-outside-toplevel
96 #Custom reset password
97 from faraday.server.api.modules.auth import auth # pylint:disable=import-outside-toplevel
99 # Custom reset password
100 from faraday.server.api.modules.auth import auth # pylint:disable=import-outside-toplevel
98101
99102 app.register_blueprint(commandsrun_api)
100103 app.register_blueprint(activityfeed_api)
145148 try:
146149 data = serialized.loads(token)
147150 user_id = data["user_id"]
148 user = User.query.filter_by(id=user_id).first()
151 user = User.query.filter_by(fs_uniquifier=user_id).first()
149152 if not user or not verify_hash(data['validation_check'], user.password):
150153 logger.warn('Invalid authentication token. token invalid after password change')
151154 return None
155158 except BadSignature:
156159 return None # invalid token
157160
158
159 @app.before_request
160 def default_login_required(): # pylint:disable=unused-variable
161 view = app.view_functions.get(flask.request.endpoint)
162
161 @app.login_manager.request_loader
162 def load_user_from_request(request):
163163 if app.config['SECURITY_TOKEN_AUTHENTICATION_HEADER'] in flask.request.headers:
164164 header = flask.request.headers[app.config['SECURITY_TOKEN_AUTHENTICATION_HEADER']]
165165 try:
173173 if not user:
174174 logger.warn('Invalid authentication token.')
175175 flask.abort(401)
176 logged_in = True
176 else:
177 return user
177178 elif auth_type == 'agent':
178179 # Don't handle the agent logic here, do it in another
179180 # before_request handler
180 logged_in = False
181 return None
182 elif auth_type == "basic":
183 username = flask.request.authorization.get('username', '')
184 password = flask.request.authorization.get('password', '')
185 user = User.query.filter_by(username=username).first()
186 if user and user.verify_and_update_password(password):
187 return user
181188 else:
182189 logger.warn("Invalid authorization type")
183190 flask.abort(401)
184 else:
185 # TODO use public flask_login functions
186 logged_in = '_user_id' in flask.session
187 user_id = session.get("_user_id")
188 if logged_in:
189 user = User.query.filter_by(id=user_id).first()
190
191 if logged_in:
192 assert user
193
194 if not logged_in and not getattr(view, 'is_public', False) \
191
192 # finally, return None if both methods did not login the user
193 return None
194
195 @app.before_request
196 def default_login_required(): # pylint:disable=unused-variable
197 view = app.view_functions.get(flask.request.endpoint)
198
199 if flask_login.current_user.is_anonymous and not getattr(view, 'is_public', False) \
195200 and flask.request.method != 'OPTIONS':
196201 flask.abort(401)
197202
198 g.user = None
199 if logged_in:
200 g.user = user
201 if user is None:
202 logger.warn(f"Unknown user id {session['_user_id']}")
203 del flask.session['_user_id']
204 flask.abort(401) # 403 would be better but breaks the web ui
205 return
206
207203 @app.before_request
208 def load_g_custom_fields(): # pylint:disable=unused-variable
204 def load_g_custom_fields(): # pylint:disable=unused-variable
209205 g.custom_fields = {}
210206
211207 @app.after_request
212 def log_queries_count(response): # pylint:disable=unused-variable
208 def log_queries_count(response): # pylint:disable=unused-variable
213209 if flask.request.method not in ['GET', 'HEAD']:
214210 # We did most optimizations for read only endpoints
215211 # TODO migrations: improve optimization and remove this if
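The block above replaces the hand-rolled session/token handling with a flask_login request_loader that resolves the user from the Authorization header and returns None when it cannot, leaving the final decision to the before_request guard. A hedged, self-contained sketch of that shape; FakeUser and TOKENS are made up, and real code would verify signed tokens and passwords rather than a plain dict lookup.

from flask import Flask
from flask_login import LoginManager, UserMixin

app = Flask(__name__)
app.secret_key = 'example'
login_manager = LoginManager(app)

class FakeUser(UserMixin):
    def __init__(self, user_id):
        self.id = user_id

TOKENS = {'secret-token': FakeUser('1')}  # illustrative token store

@login_manager.request_loader
def load_user_from_request(req):
    header = req.headers.get('Authorization', '')
    try:
        auth_type, value = header.split(None, 1)
    except ValueError:
        return None
    if auth_type.lower() == 'token':
        return TOKENS.get(value)  # None means "not logged in via token"
    if auth_type.lower() == 'basic' and req.authorization:
        # look the user up by username and verify the password here
        return None
    return None  # fall through: anonymous unless another mechanism succeeds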
284280 audit_logger.info(f"User [{user.username}] logged in from IP [{user_ip}] at [{user_login_at}]")
285281
286282
283 def uia_username_mapper(identity):
284 return bleach.clean(identity, strip=True)
285
286
287287 def create_app(db_connection_string=None, testing=None):
288
289288 class CustomFlask(Flask):
290289 SKIP_RULES = [ # These endpoints will be removed for v3
291290 '/v3/ws/<workspace_name>/hosts/bulk_delete/',
328327 'SECURITY_BACKWARDS_COMPAT_AUTH_TOKEN': True,
329328 'SECURITY_PASSWORD_SINGLE_HASH': True,
330329 'WTF_CSRF_ENABLED': False,
331 'SECURITY_USER_IDENTITY_ATTRIBUTES': ['username'],
330 'SECURITY_USER_IDENTITY_ATTRIBUTES': [{'username': {'mapper': uia_username_mapper}}],
332331 'SECURITY_POST_LOGIN_VIEW': '/_api/session',
333332 'SECURITY_POST_CHANGE_VIEW': '/_api/change',
334333 'SECURITY_RESET_PASSWORD_TEMPLATE': '/security/reset.html',
335334 'SECURITY_POST_RESET_VIEW': '/',
336 'SECURITY_SEND_PASSWORD_RESET_EMAIL':True,
337 #For testing porpouse
335 'SECURITY_SEND_PASSWORD_RESET_EMAIL': True,
336 # For testing purposes
338337 'SECURITY_EMAIL_SENDER': "[email protected]",
339338 'SECURITY_CHANGEABLE': True,
340339 'SECURITY_SEND_PASSWORD_CHANGE_EMAIL': False,
341340 'SECURITY_MSG_USER_DOES_NOT_EXIST': login_failed_message,
342341 'SECURITY_TOKEN_AUTHENTICATION_HEADER': 'Authorization',
343
344342
345343 # The line bellow should not be necessary because of the
346344 # CustomLoginForm, but i'll include it anyway.
360358 # 'sha256_crypt',
361359 # 'sha512_crypt',
362360 ],
363 'PERMANENT_SESSION_LIFETIME': datetime.timedelta(hours=int(faraday.server.config.faraday_server.session_timeout or 12)),
361 'PERMANENT_SESSION_LIFETIME': datetime.timedelta(
362 hours=int(faraday.server.config.faraday_server.session_timeout or 12)),
364363 'SESSION_COOKIE_NAME': 'faraday_session_2',
365364 'SESSION_COOKIE_SAMESITE': 'Lax',
366365 })
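SECURITY_USER_IDENTITY_ATTRIBUTES above switches to the newer Flask-Security-Too format, where each identity attribute carries a mapper that normalizes the submitted value; here the mapper strips HTML from the username with bleach before lookup. A small sketch of that combination; the sample input is made up.

import bleach

def uia_username_mapper(identity):
    # drop tags entirely rather than escaping them
    return bleach.clean(identity, strip=True)

SECURITY_USER_IDENTITY_ATTRIBUTES = [{'username': {'mapper': uia_username_mapper}}]

print(uia_username_mapper('<img src=x>admin'))  # 'admin' (non-whitelisted tags are removed)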
373372
374373 storage_path = faraday.server.config.storage.path
375374 if not storage_path:
376 logger.warn('No storage section or path in the .faraday/config/server.ini. Setting the default value to .faraday/storage')
375 logger.warn(
376 'No storage section or path in the .faraday/config/server.ini. Setting the default value to .faraday/storage')
377377 storage_path = setup_storage_path()
378378
379379 if not DepotManager.get('default'):
389389 check_testing_configuration(testing, app)
390390
391391 try:
392 app.config['SQLALCHEMY_DATABASE_URI'] = db_connection_string or faraday.server.config.database.connection_string.strip("'")
392 app.config[
393 'SQLALCHEMY_DATABASE_URI'] = db_connection_string or faraday.server.config.database.connection_string.strip(
394 "'")
393395 except AttributeError:
394 logger.info('Missing [database] section on server.ini. Please configure the database before running the server.')
396 logger.info(
397 'Missing [database] section on server.ini. Please configure the database before running the server.')
395398 except NoOptionError:
396 logger.info('Missing connection_string on [database] section on server.ini. Please configure the database before running the server.')
397
398 from faraday.server.models import db # pylint:disable=import-outside-toplevel
399 logger.info(
400 'Missing connection_string on [database] section on server.ini. Please configure the database before running the server.')
401
402 from faraday.server.models import db # pylint:disable=import-outside-toplevel
399403 db.init_app(app)
400 #Session(app)
404 # Session(app)
401405
402406 # Setup Flask-Security
403407 app.user_datastore = SQLAlchemyUserDatastore(
467471 return False
468472 self.email.data = remove_null_caracters(self.email.data)
469473
470 self.user = _datastore.get_user(self.email.data)
474 self.user = _datastore.find_user(username=self.email.data)
471475
472476 if self.user is None:
473477 audit_logger.warning(f"Invalid Login - User [{self.email.data}] from IP [{user_ip}] at [{time_now}] - "
33 See the file 'doc/LICENSE' for the license information
44
55 """
6
7 # I'm Py3
77 from apispec import APISpec
88 from apispec.ext.marshmallow import MarshmallowPlugin
99 from apispec_webframeworks.flask import FlaskPlugin
10 from faraday.server.web import app
10 from faraday.server.web import get_app
1111 import json
1212
1313 from faraday.utils.faraday_openapi_plugin import FaradayAPIPlugin
4848
4949 tags = set()
5050
51 with app.test_request_context():
52 for endpoint in app.view_functions.values():
53 spec.path(view=endpoint, app=app)
51 with get_app().test_request_context():
52 for endpoint in get_app().view_functions.values():
53 spec.path(view=endpoint, app=get_app())
5454
5555 # Set up global tags
5656 spec_yaml = yaml.load(spec.to_yaml(), Loader=yaml.SafeLoader)
7272
7373
7474 def show_all_urls():
75 print(app.url_map)
75 print(get_app().url_map)
0 from faraday.server.web import app
0 from faraday.server.web import get_app
11 from faraday.server.models import User, db
22 from flask_security.utils import hash_password
33
44
55 def changes_password(username, password):
6 with app.app_context():
6 with get_app().app_context():
77 user = User.query.filter_by(username=username).first()
88 if user:
99 user.password = hash_password(password)
1212 print("Password changed succesfully")
1313 else:
1414 print("User not found in Faraday's Database")
15 # I'm Py3
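The command modules here switch from importing a module-level app to calling a get_app() factory and opening an application context explicitly. A minimal sketch of that pattern; this is an assumed shape for illustration, not Faraday's actual get_app() implementation.

from flask import Flask

_app = None

def get_app():
    # build the Flask app lazily on first use and reuse it afterwards
    global _app
    if _app is None:
        _app = Flask(__name__)  # real code would run the full create_app() setup
    return _app

# Callers then open an application context explicitly:
with get_app().app_context():
    pass  # e.g. query the database via Flask-SQLAlchemy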
00 import sys
11 import click
22
3 from faraday.server.web import app
3 from faraday.server.web import get_app
44 from faraday.server.models import User, db
55
66
77 def change_username(current_username, new_username):
8 with app.app_context():
8 with get_app().app_context():
99 user = User.query.filter_by(username=current_username).first()
1010 if not user:
1111 print(f"\nERROR: User {current_username} was not found in Faraday's Database.")
00 import sys
11 import click
22
3 from faraday.server.web import app
3 from faraday.server.web import get_app
44 from faraday.server.models import (
55 db,
66 CustomFieldsSchema
99
1010
1111 def add_custom_field_main():
12 with app.app_context():
12 with get_app().app_context():
1313 add_custom_field_wizard()
1414
1515
1616 def delete_custom_field_main():
17 with app.app_context():
17 with get_app().app_context():
1818 delete_custom_field_wizard()
1919
2020
4040 field_type = click.prompt('Field type (int, str, list)', type=click.Choice(['int', 'str', 'list']))
4141 custom_fields = db.session.query(CustomFieldsSchema)
4242
43 #Checks the name of the fields wont be a duplicate
43 # Checks the name of the fields wont be a duplicate
4444 for custom_field in custom_fields:
4545 if field_name == custom_field.field_name \
46 or field_display_name == custom_field.field_display_name:
46 or field_display_name == custom_field.field_display_name:
4747 print('Custom field already exists, skipping')
4848 sys.exit(1)
4949
7070 invalid_field_order = True
7171 continue
7272 invalid_field_order = False
73 confirmation = click.prompt('New CustomField will be added to vulnerability -> Order {order} ({0},{1},{2}) <-, confirm to continue (yes/no)'\
74 .format(field_name, field_display_name, field_type, order=field_order))
73 confirmation = click.prompt('New CustomField will be added to vulnerability -> Order {order} ({0},{1},{2}) <-'
74 ', confirm to continue (yes/no)'
75 .format(field_name, field_display_name, field_type, order=field_order))
7576 if not confirmation:
7677 sys.exit(1)
7778
7879 custom_field_data, created = get_or_create(
79 db.session,
80 CustomFieldsSchema,
81 table_name='vulnerability',
82 field_name=field_name,
83 field_order=field_order,
80 db.session,
81 CustomFieldsSchema,
82 table_name='vulnerability',
83 field_name=field_name,
84 field_order=field_order,
8485 )
8586 if not created:
8687 print('Custom field already exists, skipping')
2424 def _draw_entity_diagrama(self):
2525 # create the pydot graph object by autoloading all tables via a bound metadata object
2626 try:
27 from sqlalchemy_schemadisplay import create_schema_graph # pylint:disable=import-outside-toplevel
27 from sqlalchemy_schemadisplay import create_schema_graph # pylint:disable=import-outside-toplevel
2828 except ImportError:
2929 print('Please install sqlalchemy_schemadisplay with "pip install sqlalchemy_schemadisplay"')
3030 sys.exit(1)
4343 sys.exit(1)
4444 raise
4545
46
4746 def _draw_uml_class_diagram(self):
4847 # lets find all the mappers in our model
4948 try:
50 from sqlalchemy_schemadisplay import create_uml_graph # pylint:disable=import-outside-toplevel
49 from sqlalchemy_schemadisplay import create_uml_graph # pylint:disable=import-outside-toplevel
5150 except ImportError:
5251 print('Please install sqlalchemy_schemadisplay with "pip install sqlalchemy_schemadisplay"')
5352 sys.exit(1)
99
1010 from sqlalchemy.exc import IntegrityError
1111
12 from faraday.server.web import app
12 from faraday.server.web import get_app
1313 from faraday.server.models import (
1414 db,
1515 VulnerabilityTemplate,
2727 def import_vulnerability_templates(language):
2828 imported_rows = 0
2929 duplicated_rows = 0
30 with app.app_context():
30 with get_app().app_context():
3131 try:
3232 res = requests.get(f'{CWE_URL}/cwe_{language}.csv')
3333 except Exception as e:
77
88 import getpass
99 import string
10
10 import uuid
1111 import os
1212 import sys
1313 import click
1616 from alembic import command
1717 from random import SystemRandom
1818 from tempfile import TemporaryFile
19 from subprocess import Popen # nosec
19 from subprocess import Popen # nosec
2020
2121 import sqlalchemy
2222 from sqlalchemy import create_engine
4747 config.get('database', 'connection_string')
4848 reconfigure = None
4949 while not reconfigure:
50 reconfigure = input(f'Database section {Fore.YELLOW} already found{Fore.WHITE}. Do you want to reconfigure database? (yes/no) ')
50 reconfigure = input(
51 f'Database section {Fore.YELLOW} already found{Fore.WHITE}. Do you want to reconfigure database? (yes/no) ')
5152 if reconfigure.lower() == 'no':
5253 return False
5354 elif reconfigure.lower() == 'yes':
118119 else:
119120 user_password = self.generate_random_pw(12)
120121 already_created = False
122 fs_uniquifier = str(uuid.uuid4())
121123 try:
122124
123125 statement = text("""
124126 INSERT INTO faraday_user (
125127 username, name, password,
126128 is_ldap, active, last_login_ip,
127 current_login_ip, role, state_otp
129 current_login_ip, role, state_otp, fs_uniquifier
128130 ) VALUES (
129131 'faraday', 'Administrator', :password,
130132 false, true, '127.0.0.1',
131 '127.0.0.1', 'admin', 'disabled'
133 '127.0.0.1', 'admin', 'disabled', :fs_uniquifier
132134 )
133135 """)
134136 params = {
135 'password': hash_password(user_password)
137 'password': hash_password(user_password),
138 'fs_uniquifier': fs_uniquifier
136139 }
137140 connection = engine.connect()
138141 connection.execute(statement, **params)
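Flask-Security-Too expects every user row to carry a unique fs_uniquifier, which the admin-creation SQL above now supplies via uuid4. A self-contained sketch of the same bound-parameter insert against an in-memory SQLite database; the table here is a toy stand-in for faraday_user in PostgreSQL.

import uuid
from sqlalchemy import create_engine, text

engine = create_engine('sqlite://')
with engine.connect() as conn:
    conn.execute(text("CREATE TABLE faraday_user (username TEXT, password TEXT, fs_uniquifier TEXT)"))
    statement = text("""
        INSERT INTO faraday_user (username, password, fs_uniquifier)
        VALUES ('faraday', :password, :fs_uniquifier)
    """)
    conn.execute(statement, {'password': 'hash-goes-here',
                             'fs_uniquifier': str(uuid.uuid4())})
    print(conn.execute(text("SELECT username, fs_uniquifier FROM faraday_user")).fetchall())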
141144 # when re using database user could be created previously
142145 already_created = True
143146 print(
144 "{yellow}WARNING{white}: Faraday administrator user already exists.".format(
145 yellow=Fore.YELLOW, white=Fore.WHITE))
147 "{yellow}WARNING{white}: Faraday administrator user already exists.".format(
148 yellow=Fore.YELLOW, white=Fore.WHITE))
146149 else:
147150 print(
148151 "{yellow}WARNING{white}: Can't create administrator user.".format(
152155 print("Admin user created with \n\n{red}username: {white}faraday \n"
153156 "{red}password:{white} {"
154157 "user_password} \n".format(user_password=user_password,
155 white=Fore.WHITE, red=Fore.RED))
158 white=Fore.WHITE, red=Fore.RED))
156159
157160 def _configure_existing_postgres_user(self):
158161 username = input('Please enter the postgresql username: ')
166169 if 'unknown user: postgres' in psql_output:
167170 print(f'ERROR: Postgres user not found. Did you install package {Fore.BLUE}postgresql{Fore.WHITE}?')
168171 elif 'could not connect to server' in psql_output:
169 print(f'ERROR: {Fore.RED}PostgreSQL service{Fore.WHITE} is not running. Please verify that it is running in port 5432 before executing setup script.')
172 print(
173 f'ERROR: {Fore.RED}PostgreSQL service{Fore.WHITE} is not running. Please verify that it is running in port 5432 before executing setup script.')
170174 elif process_status > 0:
171175 current_psql_output_file.seek(0)
172176 print('ERROR: ' + psql_output)
173177
174178 if process_status != 0:
175 current_psql_output_file.close() # delete temp file
179 current_psql_output_file.close() # delete temp file
176180 sys.exit(process_status)
177181
178182 def generate_random_pw(self, pwlen):
184188 This step will create the role on the database.
185189 we return username and password and those values will be saved in the config file.
186190 """
187 print('This script will {blue} create a new postgres user {white} and {blue} save faraday-server settings {white}(server.ini). '.format(blue=Fore.BLUE, white=Fore.WHITE))
188 username = os.environ.get("FARADAY_DATABASE_USER", 'faraday_postgresql')
191 print(
192 'This script will {blue} create a new postgres user {white} and {blue} save faraday-server settings {white}(server.ini). '.format(
193 blue=Fore.BLUE, white=Fore.WHITE))
194 username = os.environ.get("FARADAY_DATABASE_USER", 'faraday_postgresql')
189195 postgres_command = ['sudo', '-u', 'postgres', 'psql']
190196 if sys.platform == 'darwin':
191197 print(f'{Fore.BLUE}MAC OS detected{Fore.WHITE}')
192198 postgres_command = ['psql', 'postgres']
193199 password = self.generate_random_pw(25)
194 command = postgres_command + [ '-c', 'CREATE ROLE {0} WITH LOGIN PASSWORD \'{1}\';'.format(username, password)]
195 p = Popen(command, stderr=psql_log_file, stdout=psql_log_file) # nosec
200 command = postgres_command + ['-c', 'CREATE ROLE {0} WITH LOGIN PASSWORD \'{1}\';'.format(username, password)]
201 p = Popen(command, stderr=psql_log_file, stdout=psql_log_file) # nosec
196202 p.wait()
197203 psql_log_file.seek(0)
198204 output = psql_log_file.read()
205211
206212 try:
207213 if not getattr(faraday.server.config, 'database', None):
208 print('Manual configuration? \n faraday_postgresql was found in PostgreSQL, but no connection string was found in server.ini. ')
209 print('Please configure [database] section with correct postgresql string. Ex. postgresql+psycopg2://faraday_postgresql:PASSWORD@localhost/faraday')
214 print(
215 'Manual configuration? \n faraday_postgresql was found in PostgreSQL, but no connection string was found in server.ini. ')
216 print(
217 'Please configure [database] section with correct postgresql string. Ex. postgresql+psycopg2://faraday_postgresql:PASSWORD@localhost/faraday')
210218 sys.exit(1)
211219 try:
212220 password = faraday.server.config.database.connection_string.split(':')[2].split('@')[0]
213221 except AttributeError:
214222 print('Could not find connection string.')
215 print('Please configure [database] section with correct postgresql string. Ex. postgresql+psycopg2://faraday_postgresql:PASSWORD@localhost/faraday')
223 print(
224 'Please configure [database] section with correct postgresql string. Ex. postgresql+psycopg2://faraday_postgresql:PASSWORD@localhost/faraday')
216225 sys.exit(1)
217226 connection = psycopg2.connect(dbname='postgres',
218227 user=username,
244253
245254 print(f'Creating database {database_name}')
246255 command = postgres_command + ['createdb', '-E', 'utf8', '-O', username, database_name]
247 p = Popen(command, stderr=psql_log_file, stdout=psql_log_file, cwd='/tmp') # nosec
256 p = Popen(command, stderr=psql_log_file, stdout=psql_log_file, cwd='/tmp') # nosec
248257 p.wait()
249258 return_code = p.returncode
250259 psql_log_file.seek(0)
274283
275284 def _create_tables(self, conn_string):
276285 print('Creating tables')
277 from faraday.server.models import db # pylint:disable=import-outside-toplevel
286 from faraday.server.models import db # pylint:disable=import-outside-toplevel
278287 current_app.config['SQLALCHEMY_DATABASE_URI'] = conn_string
279288
280289 # Check if the alembic_version exists
292301 db.create_all()
293302 except OperationalError as ex:
294303 if 'could not connect to server' in str(ex):
295 print(f'ERROR: {Fore.RED}PostgreSQL service{Fore.WHITE} is not running. Please verify that it is running in port 5432 before executing setup script.')
304 print(
305 f'ERROR: {Fore.RED}PostgreSQL service{Fore.WHITE} is not running. Please verify that it is running in port 5432 before executing setup script.')
296306 sys.exit(1)
297307 elif 'password authentication failed' in str(ex):
298308 print('ERROR: ')
77 import click
88
99 from faraday.server.models import db
10 from faraday.server.web import app
10 from faraday.server.web import get_app
1111 from faraday.server.commands.initdb import InitDB
1212 import faraday.server.config
1313
3030
3131
3232 def reset_db():
33 with app.app_context():
33 with get_app().app_context():
3434 reset_db_all()
3535
3636
1010 from colorama import Fore
1111
1212 import faraday.server.config
13 from faraday.server.web import app
13 from faraday.server.web import get_app
1414 from faraday.server.models import db
1515 from faraday.server.config import CONST_FARADAY_HOME_PATH
1616 from faraday.server.utils.daemonize import is_server_running
2626
2727
2828 def check_open_ports():
29 address = faraday.server.config.faraday_server.bind_address
29 address = faraday.server.config.faraday_server.bind_address
3030 port = int(faraday.server.config.faraday_server.port)
3131 sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
3232 result = sock.connect_ex((address, port))
3737
3838
3939 def check_postgres():
40 with app.app_context():
40 with get_app().app_context():
4141 try:
42 result = (db.session.query("version()").one(),db.session.query("current_setting('server_version_num')").one())
42 result = (
43 db.session.query("version()").one(), db.session.query("current_setting('server_version_num')").one())
4344 return result
4445 except sqlalchemy.exc.OperationalError:
4546 return False
4849
4950
5051 def check_locks_postgresql():
51 with app.app_context():
52 with get_app().app_context():
5253 psql_status = check_postgres()
5354 if psql_status:
5455 result = db.engine.execute("""SELECT blocked_locks.pid AS blocked_pid,
8485
8586
8687 def check_postgresql_encoding():
87 with app.app_context():
88 with get_app().app_context():
8889 psql_status = check_postgres()
8990 if psql_status:
9091 encoding = db.engine.execute("SHOW SERVER_ENCODING").first()[0]
9495
9596
9697 def check_storage_permission():
97
9898 path = CONST_FARADAY_HOME_PATH / 'storage' / 'test'
9999
100100 try:
112112 data_keys = ['bind_address', 'port', 'websocket_port', 'debug']
113113 for key in data_keys:
114114 print('{blue} {KEY}: {white}{VALUE}'.
115 format(KEY=key, VALUE=getattr(faraday.server.config.faraday_server, key), white=Fore.WHITE, blue=Fore.BLUE))
115 format(KEY=key, VALUE=getattr(faraday.server.config.faraday_server, key), white=Fore.WHITE,
116 blue=Fore.BLUE))
116117
117118 print(f'\n{Fore.WHITE}Showing faraday plugins data')
118119 print(f"{Fore.BLUE} version: {Fore.WHITE}{faraday_plugins.__version__}")
135136 exit_code = 0
136137 result = check_postgres()
137138
138
139 if result == False:
140 print('[{red}-{white}] Could not connect to PostgreSQL, please check if database is running'\
141 .format(red=Fore.RED, white=Fore.WHITE))
139 if not result:
140 print('[{red}-{white}] Could not connect to PostgreSQL, please check if database is running'
141 .format(red=Fore.RED, white=Fore.WHITE))
142142 exit_code = 1
143143 return exit_code
144 elif result == None:
145 print('[{red}-{white}] Database not initialized. Execute: faraday-manage initdb'\
146 .format(red=Fore.RED, white=Fore.WHITE))
144 elif result is None:
145 print('[{red}-{white}] Database not initialized. Execute: faraday-manage initdb'
146 .format(red=Fore.RED, white=Fore.WHITE))
147147 exit_code = 1
148148 return exit_code
149 elif int(result[1][0])<90400:
150 print('[{red}-{white}] PostgreSQL is running, but needs to be 9.4 or newer, please update PostgreSQL'.\
151 format(red=Fore.RED, white=Fore.WHITE))
149 elif int(result[1][0]) < 90400:
150 print('[{red}-{white}] PostgreSQL is running, but needs to be 9.4 or newer, please update PostgreSQL'
151 .format(red=Fore.RED, white=Fore.WHITE))
152152 elif result:
153153 print(f'[{Fore.GREEN}+{Fore.WHITE}] PostgreSQL is running and up to date')
154154 return exit_code
161161 lock_status = check_locks_postgresql()
162162 if lock_status:
163163 print(f'[{Fore.YELLOW}-{Fore.WHITE}] Warning: PostgreSQL lock detected.')
164 elif lock_status == False:
164 elif not lock_status:
165165 print(f'[{Fore.GREEN}+{Fore.WHITE}] PostgreSQL lock not detected. ')
166 elif lock_status == None:
166 elif lock_status is None:
167167 pass
168168
169169 encoding = check_postgresql_encoding()
170170 if encoding:
171171 print(f'[{Fore.GREEN}+{Fore.WHITE}] PostgreSQL encoding: {encoding}')
172 elif encoding == None:
172 elif encoding is None:
173173 pass
174174
175175
176176 def print_faraday_status():
177177 """Prints Status of farday using check_server_running() """
178178
179 #Prints Status of the server using check_server_running()
179 # Prints Status of the server using check_server_running()
180180 pid = check_server_running()
181181 if pid is not None:
182182 print('[{green}+{white}] Faraday Server is running. PID:{PID} \
198198 print(f'[{Fore.RED}-{Fore.WHITE}] /.faraday/storage -> Permission denied')
199199
200200 if check_open_ports():
201 print("[{green}+{white}] Port {PORT} in {ad} is open"\
202 .format(PORT=faraday.server.config.faraday_server.port, green=Fore.GREEN,white=Fore.WHITE,ad=faraday.server.config.faraday_server.bind_address))
203 else:
204 print("[{red}-{white}] Port {PORT} in {ad} is not open"\
205 .format(PORT=faraday.server.config.faraday_server.port,red=Fore.RED,white=Fore.WHITE,ad =faraday.server.config.faraday_server.bind_address))
201 print("[{green}+{white}] Port {PORT} in {ad} is open"
202 .format(PORT=faraday.server.config.faraday_server.port,
203 green=Fore.GREEN, white=Fore.WHITE, ad=faraday.server.config.faraday_server.bind_address))
204 else:
205 print("[{red}-{white}] Port {PORT} in {ad} is not open"
206 .format(PORT=faraday.server.config.faraday_server.port,
207 red=Fore.RED, white=Fore.WHITE, ad=faraday.server.config.faraday_server.bind_address))
206208
207209
208210 def full_status_check():
144144 self.session_timeout = 12
145145 self.api_token_expiration = 43200 # Default as 12 hs
146146 self.agent_registration_secret = None
147 self.agent_token_expiration = 60 # Default as 1 min
147148 self.debug = False
148149 self.custom_plugins_folder = None
149150 self.ignore_info_severity = False
163164 self.use_start_tls = None
164165
165166
166
167167 class SmtpConfigObject(ConfigSection):
168168 def __init__(self):
169169 self.username = None
188188 class LoggerConfig(ConfigSection):
189189 def __init__(self):
190190 self.use_rfc5424_formatter = False
191
191192
192193 database = DatabaseConfigObject()
193194 dashboard = DashboardConfigObject()
6767 changes_queue.put(msg)
6868
6969
70
7170 def update_object_event(mapper, connection, instance):
7271 delta = instance.update_date - instance.create_date
7372 if delta.seconds < 30:
8786
8887 def after_insert_check_child_has_same_workspace(mapper, connection, inserted_instance):
8988 if inserted_instance.parent:
90 assert (inserted_instance.workspace ==
91 inserted_instance.parent.workspace), \
89 assert (inserted_instance.workspace
90 == inserted_instance.parent.workspace), \
9291 "Conflicting workspace assignation for objects. " \
9392 "This should never happen!!!"
9493
95
96
97 assert (inserted_instance.workspace_id ==
98 inserted_instance.parent.workspace_id), \
94 assert (inserted_instance.workspace_id
95 == inserted_instance.parent.workspace_id), \
9996 "Conflicting workspace_id assignation for objects. " \
10097 "This should never happen!!!"
10198
105102 if inspect.isclass(obj) and getattr(obj, 'workspace_id', None):
106103 event.listen(obj, 'after_insert', after_insert_check_child_has_same_workspace)
107104 event.listen(obj, 'after_update', after_insert_check_child_has_same_workspace)
108
109
110105
111106
112107 # Events for websockets
120115 # Update object bindings
121116 event.listen(Host, 'after_update', update_object_event)
122117 event.listen(Service, 'after_update', update_object_event)
123 # I'm Py3
00 # Faraday Penetration Test IDE
11 # Copyright (C) 2016 Infobyte LLC (http://www.infobytesec.com/)
22 # See the file 'doc/LICENSE' for the license information
3 import json
34 import logging
45 import operator
56 import string
120121 cursor.close()
121122
122123 @event.listens_for(rv, "begin")
123 def do_begin(conn): # pylint:disable=unused-variable
124 def do_begin(conn): # pylint:disable=unused-variable
124125 # emit our own BEGIN
125126 conn.execute("BEGIN")
126127 return rv
168169 query = select([BooleanToIntColumn("(count(*) = 0)")])
169170 query = query.select_from(text('command_object as command_object_inner'))
170171 where_expr = " command_object_inner.create_date < command_object.create_date and " \
171 " (command_object_inner.object_id = command_object.object_id and " \
172 " command_object_inner.object_type = command_object.object_type) and " \
173 " command_object_inner.workspace_id = command_object.workspace_id "
172 " (command_object_inner.object_id = command_object.object_id and " \
173 " command_object_inner.object_type = command_object.object_type) and " \
174 " command_object_inner.workspace_id = command_object.workspace_id "
174175 query = query.where(text(where_expr))
175176 return column_property(
176177 query,
209210 # I suppose that we're using PostgreSQL, that can't compare
210211 # booleans with integers
211212 query = query.where(text("vulnerability.confirmed = true"))
212 elif confirmed == False:
213 elif confirmed is False:
213214 if db.session.bind.dialect.name == 'sqlite':
214215 # SQLite has no "true" expression, we have to use the integer 1
215216 # instead
304305
305306 vuln_count = (
306307 select([func.count(text('vulnerability.id'))]).
307 select_from(text('vulnerability')).
308 where(text(f'vulnerability.host_id = host.id and vulnerability.severity = \'{severity}\'')).
309 as_scalar()
308 select_from(text('vulnerability')).
309 where(text(f'vulnerability.host_id = host.id and vulnerability.severity = \'{severity}\'')).
310 as_scalar()
310311 )
311312
312313 vuln_web_count = (
313314 select([func.count(text('vulnerability.id'))]).
314 select_from(text('vulnerability, service')).
315 where(text('(vulnerability.service_id = service.id and '
316 f'service.host_id = host.id) and vulnerability.severity = \'{severity}\'')).
317 as_scalar()
315 select_from(text('vulnerability, service')).
316 where(text('(vulnerability.service_id = service.id and '
317 f'service.host_id = host.id) and vulnerability.severity = \'{severity}\'')).
318 as_scalar()
318319 )
319320
320321 vulnerability_generic_count = column_property(
368369 function = BlankColumn(Text)
369370 module = BlankColumn(Text)
370371
372 # 1 workspace <--> N source_codes
373 # 1 to N (the FK is placed in the child) and bidirectional (backref)
371374 workspace_id = Column(Integer, ForeignKey('workspace.id'), index=True, nullable=False)
372375 workspace = relationship('Workspace', backref='source_codes')
373376
428431
429432 host_id = Column(Integer, ForeignKey('host.id'), index=True, nullable=False)
430433 host = relationship('Host', backref=backref("hostnames", cascade="all, delete-orphan"))
431 workspace_id = Column(Integer, ForeignKey('workspace.id'), index=True, nullable=False)
434
435 # 1 workspace <--> N hostnames
436 # 1 to N (the FK is placed in the child) and bidirectional (backref)
437 workspace_id = Column(Integer, ForeignKey('workspace.id', ondelete='CASCADE'), index=True, nullable=False)
432438 workspace = relationship(
433439 'Workspace',
434 backref='hostnames',
435 foreign_keys=[workspace_id]
436 )
440 foreign_keys=[workspace_id],
441 backref=backref('hostnames', cascade="all, delete-orphan", passive_deletes=True),
442 )
443
437444 __table_args__ = (
438445 UniqueConstraint(name, host_id, workspace_id, name='uix_hostname_host_workspace'),
439446 )
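The Hostname/workspace relationship above (and the host, service and vulnerability ones further down) gains ondelete='CASCADE' on the foreign key plus passive_deletes=True on the backref, so the database removes children when a workspace is deleted instead of the ORM loading and deleting them row by row. A toy sketch of that combination; the models here are stand-ins, not Faraday's.

from sqlalchemy import Column, Integer, String, ForeignKey
from sqlalchemy.orm import relationship, backref
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class Workspace(Base):
    __tablename__ = 'workspace'
    id = Column(Integer, primary_key=True)
    name = Column(String)

class Hostname(Base):
    __tablename__ = 'hostname'
    id = Column(Integer, primary_key=True)
    name = Column(String)
    # ON DELETE CASCADE lives in the schema...
    workspace_id = Column(Integer, ForeignKey('workspace.id', ondelete='CASCADE'),
                          index=True, nullable=False)
    # ...and passive_deletes tells the ORM to trust it instead of emitting
    # per-row DELETEs for the children.
    workspace = relationship(
        'Workspace',
        backref=backref('hostnames', cascade='all, delete-orphan', passive_deletes=True),
    )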
444451 @property
445452 def parent(self):
446453 return self.host
447
448454
449455
450456 class CustomFieldsSchema(db.Model):
566572 for new_value in self._create(value):
567573 self.col.add(new_value)
568574
575
569576 def _build_associationproxy_creator(model_class_name):
570577 def creator(name, vulnerability):
571578 """Get or create a reference/policyviolation with the
658665 command = relationship('Command', backref='command_objects')
659666 command_id = Column(Integer, ForeignKey('command.id'), index=True)
660667
668 # 1 workspace <--> N command_objects
669 # 1 to N (the FK is placed in the child) and bidirectional (backref)
661670 workspace_id = Column(Integer, ForeignKey('workspace.id'), index=True, nullable=False)
662671 workspace = relationship(
663672 'Workspace',
664673 foreign_keys=[workspace_id],
665 backref = backref('command_objects', cascade="all, delete-orphan")
674 backref=backref('command_objects', cascade="all, delete-orphan")
666675 )
667676
668677 create_date = Column(DateTime, default=datetime.utcnow)
703712
704713 # db.session.flush()
705714 assert object_.id is not None, "object must have an ID. Try " \
706 "flushing the session"
715 "flushing the session"
707716 kwargs['object_id'] = object_.id
708717 kwargs['object_type'] = object_type
709718 return super().__init__(**kwargs)
715724 where_conditions.append("command_object.workspace_id = command.workspace_id")
716725 return column_property(
717726 select([func.sum(CommandObject.created)]).
718 select_from(table('command_object')).
719 where(text(' and '.join(where_conditions)))
727 select_from(table('command_object')).
728 where(text(' and '.join(where_conditions)))
720729 )
721730
722731
734743 for attr, filter_value in join_filters.items():
735744 where_conditions.append(f"vulnerability.{attr} = {filter_value}")
736745 return column_property(
737 select([func.sum(CommandObject.created)]). \
738 select_from(table('command_object')). \
739 select_from(table('vulnerability')). \
740 where(text(' and '.join(where_conditions)))
746 select([func.sum(CommandObject.created)])
747 .select_from(table('command_object'))
748 .select_from(table('vulnerability'))
749 .where(text(' and '.join(where_conditions)))
741750 )
742751
743752
744753 class Command(Metadata):
745
746754 IMPORT_SOURCE = [
747 'report', # all the files the tools export and faraday imports it from the resports directory, gtk manual import or web import.
755 'report',
756 # all the files the tools export; faraday imports them from the reports directory, GTK manual import or web import.
748757 'shell', # command executed on the shell or webshell with hooks connected to faraday.
749758 'agent'
750759 ]
761770 user = BlankColumn(String(250)) # os username where the command was executed
762771 import_source = Column(Enum(*IMPORT_SOURCE, name='import_source_enum'))
763772
773 # 1 workspace <--> N commands
774 # 1 to N (the FK is placed in the child) and bidirectional (backref)
764775 workspace_id = Column(Integer, ForeignKey('workspace.id'), index=True, nullable=False)
765776 workspace = relationship(
766777 'Workspace',
770781
771782 sum_created_vulnerabilities = _make_created_objects_sum('vulnerability')
772783
773 sum_created_vulnerabilities_web = _make_created_objects_sum_joined('vulnerability', {'type': '\'vulnerability_web\''})
784 sum_created_vulnerabilities_web = _make_created_objects_sum_joined('vulnerability',
785 {'type': '\'vulnerability_web\''})
774786
775787 sum_created_hosts = _make_created_objects_sum('host')
776788
817829 cascade="all, delete-orphan"
818830 )
819831
820 workspace_id = Column(Integer, ForeignKey('workspace.id'), index=True,
821 nullable=False)
832 # 1 workspace <--> N hosts
833 # 1 to N (the FK is placed in the child) and bidirectional (backref)
834 workspace_id = Column(Integer, ForeignKey('workspace.id', ondelete='CASCADE'), index=True, nullable=False)
822835 workspace = relationship(
823836 'Workspace',
824837 foreign_keys=[workspace_id],
825 backref=backref("hosts", cascade="all, delete-orphan")
826 )
838 backref=backref("hosts", cascade="all, delete-orphan", passive_deletes=True)
839 )
827840
828841 open_service_count = _make_generic_count_property(
829842 'host', 'service', where=text("service.status = 'open'"))
831844
832845 __host_vulnerabilities = (
833846 select([func.count(text('vulnerability.id'))]).
834 select_from(text('vulnerability')).
835 where(text('vulnerability.host_id = host.id')).
836 as_scalar()
847 select_from(text('vulnerability')).
848 where(text('vulnerability.host_id = host.id')).
849 as_scalar()
837850 )
838851 __service_vulnerabilities = (
839852 select([func.count(text('vulnerability.id'))]).
840 select_from(text('vulnerability, service')).
841 where(text('vulnerability.service_id = service.id and '
842 'service.host_id = host.id')).
843 as_scalar()
853 select_from(text('vulnerability, service')).
854 where(text('vulnerability.service_id = service.id and '
855 'service.host_id = host.id')).
856 as_scalar()
844857 )
845858 vulnerability_count = column_property(
846859 # select(text('count(*)')).select_from(__host_vulnerabilities.subquery()),
880893 cls.vulnerability_informational_count,
881894 _make_vuln_count_property(
882895 type_=None,
883 confirmed = confirmed,
884 use_column_property = False,
885 extra_query = "vulnerability.severity='informational'",
886 get_hosts_vulns = True
896 confirmed=confirmed,
897 use_column_property=False,
898 extra_query="vulnerability.severity='informational'",
899 get_hosts_vulns=True
887900 )
888901 ),
889902 with_expression(
890903 cls.vulnerability_medium_count,
891904 _make_vuln_count_property(
892 type_ = None,
893 confirmed = confirmed,
894 use_column_property = False,
895 extra_query = "vulnerability.severity='medium'",
896 get_hosts_vulns = True
905 type_=None,
906 confirmed=confirmed,
907 use_column_property=False,
908 extra_query="vulnerability.severity='medium'",
909 get_hosts_vulns=True
897910 )
898911 ),
899912 with_expression(
900913 cls.vulnerability_high_count,
901914 _make_vuln_count_property(
902 type_ = None,
903 confirmed = confirmed,
904 use_column_property = False,
905 extra_query = "vulnerability.severity='high'",
906 get_hosts_vulns = True
915 type_=None,
916 confirmed=confirmed,
917 use_column_property=False,
918 extra_query="vulnerability.severity='high'",
919 get_hosts_vulns=True
907920 )
908921 ),
909922 with_expression(
910923 cls.vulnerability_critical_count,
911924 _make_vuln_count_property(
912 type_ = None,
913 confirmed = confirmed,
914 use_column_property = False,
915 extra_query = "vulnerability.severity='critical'",
916 get_hosts_vulns = True
925 type_=None,
926 confirmed=confirmed,
927 use_column_property=False,
928 extra_query="vulnerability.severity='critical'",
929 get_hosts_vulns=True
917930 )
918931 ),
919932 with_expression(
920933 cls.vulnerability_low_count,
921934 _make_vuln_count_property(
922 type_ = None,
923 confirmed = confirmed,
924 use_column_property = False,
925 extra_query = "vulnerability.severity='low'",
926 get_hosts_vulns = True
935 type_=None,
936 confirmed=confirmed,
937 use_column_property=False,
938 extra_query="vulnerability.severity='low'",
939 get_hosts_vulns=True
927940 )
928941 ),
929942 with_expression(
930943 cls.vulnerability_unclassified_count,
931944 _make_vuln_count_property(
932 type_ = None,
933 confirmed = confirmed,
934 use_column_property = False,
935 extra_query = "vulnerability.severity='unclassified'",
936 get_hosts_vulns = True
945 type_=None,
946 confirmed=confirmed,
947 use_column_property=False,
948 extra_query="vulnerability.severity='unclassified'",
949 get_hosts_vulns=True
937950 )
938951 ),
939952 with_expression(
940953 cls.vulnerability_total_count,
941954 _make_vuln_count_property(
942 type_ = None,
943 confirmed = confirmed,
944 use_column_property = False,
945 get_hosts_vulns = True
955 type_=None,
956 confirmed=confirmed,
957 use_column_property=False,
958 get_hosts_vulns=True
946959 )
947960 ),
948961 )
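The options above populate per-severity counters at query time with with_expression(); the counters themselves are declared on the model as query_expression() placeholders. A compact runnable sketch of that pairing, with a toy model and a literal expression standing in for the real count subquery.

from sqlalchemy import Column, Integer, String, create_engine, literal
from sqlalchemy.orm import query_expression, with_expression, sessionmaker
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class Host(Base):
    __tablename__ = 'host'
    id = Column(Integer, primary_key=True)
    ip = Column(String)
    # placeholder attribute, filled in per-query with with_expression()
    vulnerability_count = query_expression()

engine = create_engine('sqlite://')
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()
session.add(Host(ip='10.0.0.1'))
session.commit()

host = (session.query(Host)
        .options(with_expression(Host.vulnerability_count, literal(0)))
        .first())
print(host.ip, host.vulnerability_count)  # 10.0.0.1 0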
9891002 foreign_keys=[host_id],
9901003 )
9911004
992 workspace_id = Column(Integer, ForeignKey('workspace.id'), index=True, nullable=False)
1005 # 1 workspace <--> N services
1006 # 1 to N (the FK is placed in the child) and bidirectional (backref)
1007 workspace_id = Column(Integer, ForeignKey('workspace.id', ondelete='CASCADE'), index=True, nullable=False)
9931008 workspace = relationship(
9941009 'Workspace',
995 backref=backref('services', cascade="all, delete-orphan"),
996 foreign_keys=[workspace_id]
1010 foreign_keys=[workspace_id],
1011 backref=backref('services', cascade="all, delete-orphan", passive_deletes=True),
9971012 )
9981013
9991014 vulnerability_count = _make_generic_count_property('service',
10491064 website = BlankColumn(Text)
10501065 status_code = Column(Integer, nullable=True)
10511066
1052
10531067 vulnerability_duplicate_id = Column(
1054 Integer,
1055 ForeignKey('vulnerability.id'),
1056 index=True,
1057 nullable=True,
1058 )
1068 Integer,
1069 ForeignKey('vulnerability.id'),
1070 index=True,
1071 nullable=True,
1072 )
10591073 duplicate_childs = relationship("VulnerabilityGeneric", cascade="all, delete-orphan",
1060 backref=backref('vulnerability_duplicate', remote_side=[id])
1061 )
1074 backref=backref('vulnerability_duplicate', remote_side=[id])
1075 )
10621076
10631077 vulnerability_template_id = Column(
1064 Integer,
1065 ForeignKey('vulnerability_template.id'),
1066 index=True,
1067 nullable=True,
1068 )
1069
1070 vulnerability_template = relationship('VulnerabilityTemplate', backref=backref('duplicate_vulnerabilities', passive_deletes='all'))
1071
1072 workspace_id = Column(
1073 Integer,
1074 ForeignKey('workspace.id'),
1075 index=True,
1076 nullable=False,
1077 )
1078 workspace = relationship('Workspace', backref='vulnerabilities')
1078 Integer,
1079 ForeignKey('vulnerability_template.id'),
1080 index=True,
1081 nullable=True,
1082 )
1083
1084 vulnerability_template = relationship('VulnerabilityTemplate',
1085 backref=backref('duplicate_vulnerabilities', passive_deletes='all'))
1086
1087 # 1 workspace <--> N vulnerabilities
1088 # 1 to N (the FK is placed in the child) and bidirectional (backref)
1089 workspace_id = Column(Integer, ForeignKey('workspace.id', ondelete='CASCADE'), index=True, nullable=False)
1090 workspace = relationship(
1091 'Workspace',
1092 backref=backref('vulnerabilities', cascade="all, delete-orphan", passive_deletes=True)
1093 )
10791094
10801095 reference_instances = relationship(
10811096 "Reference",
11191134
11201135 creator_command_id = column_property(
11211136 select([CommandObject.command_id])
1122 .where(CommandObject.object_type == 'vulnerability')
1123 .where(text('command_object.object_id = vulnerability.id'))
1124 .where(CommandObject.workspace_id == workspace_id)
1125 .order_by(asc(CommandObject.create_date))
1126 .limit(1),
1137 .where(CommandObject.object_type == 'vulnerability')
1138 .where(text('command_object.object_id = vulnerability.id'))
1139 .where(CommandObject.workspace_id == workspace_id)
1140 .order_by(asc(CommandObject.create_date))
1141 .limit(1),
11271142 deferred=True)
11281143
11291144 creator_command_tool = column_property(
11301145 select([Command.tool])
1131 .select_from(join(Command, CommandObject,
1132 Command.id == CommandObject.command_id))
1133 .where(CommandObject.object_type == 'vulnerability')
1134 .where(text('command_object.object_id = vulnerability.id'))
1135 .where(CommandObject.workspace_id == workspace_id)
1136 .order_by(asc(CommandObject.create_date))
1137 .limit(1),
1146 .select_from(join(Command, CommandObject,
1147 Command.id == CommandObject.command_id))
1148 .where(CommandObject.object_type == 'vulnerability')
1149 .where(text('command_object.object_id = vulnerability.id'))
1150 .where(CommandObject.workspace_id == workspace_id)
1151 .order_by(asc(CommandObject.create_date))
1152 .limit(1),
11381153 deferred=True
11391154 )
11401155
11411156 _host_ip_query = (
11421157 select([Host.ip])
1143 .where(text('vulnerability.host_id = host.id'))
1158 .where(text('vulnerability.host_id = host.id'))
11441159 )
11451160 _service_ip_query = (
11461161 select([text('host_inner.ip')])
1147 .select_from(text('host as host_inner, service'))
1148 .where(text('vulnerability.service_id = service.id and '
1149 'host_inner.id = service.host_id'))
1162 .select_from(text('host as host_inner, service'))
1163 .where(text('vulnerability.service_id = service.id and '
1164 'host_inner.id = service.host_id'))
11501165 )
11511166 target_host_ip = column_property(
11521167 case([
11531168 (text('vulnerability.host_id IS NOT null'),
1154 _host_ip_query.as_scalar()),
1169 _host_ip_query.as_scalar()),
11551170 (text('vulnerability.service_id IS NOT null'),
1156 _service_ip_query.as_scalar())
1171 _service_ip_query.as_scalar())
11571172 ]),
11581173 deferred=True
11591174 )
11601175
11611176 _host_os_query = (
11621177 select([Host.os])
1163 .where(text('vulnerability.host_id = host.id'))
1178 .where(text('vulnerability.host_id = host.id'))
11641179 )
11651180 _service_os_query = (
11661181 select([text('host_inner.os')])
1167 .select_from(text('host as host_inner, service'))
1168 .where(text('vulnerability.service_id = service.id and '
1169 'host_inner.id = service.host_id'))
1182 .select_from(text('host as host_inner, service'))
1183 .where(text('vulnerability.service_id = service.id and '
1184 'host_inner.id = service.host_id'))
11701185 )
11711186
11721187 host_id = Column(Integer, ForeignKey(Host.id), index=True)
11791194 target_host_os = column_property(
11801195 case([
11811196 (text('vulnerability.host_id IS NOT null'),
1182 _host_os_query.as_scalar()),
1197 _host_os_query.as_scalar()),
11831198 (text('vulnerability.service_id IS NOT null'),
1184 _service_os_query.as_scalar())
1199 _service_os_query.as_scalar())
11851200 ]),
11861201 deferred=True
11871202 )
12031218
12041219 @property
12051220 def has_duplicate(self):
1206 return self.vulnerability_duplicate_id == None
1221 return self.vulnerability_duplicate_id is None
12071222
12081223 @property
12091224 def hostnames(self):
12291244 @declared_attr
12301245 def service(cls):
12311246 return relationship('Service', backref=backref("vulnerabilities", cascade="all, delete-orphan"))
1232
12331247
12341248 @property
12351249 def parent(self):
12511265 kwargs['response'] = ''.join([x for x in kwargs['response'] if x in string.printable])
12521266 super().__init__(*args, **kwargs)
12531267
1254
12551268 @declared_attr
12561269 def service_id(cls):
12571270 return VulnerabilityGeneric.__table__.c.get(
13191332 id = Column(Integer, primary_key=True)
13201333 name = NonBlankColumn(Text)
13211334
1322 workspace_id = Column(
1323 Integer,
1324 ForeignKey('workspace.id'),
1325 index=True,
1326 nullable=False
1327 )
1335 # 1 workspace <--> N references
1336 # 1 to N (the FK is placed in the child) and bidirectional (backref)
1337 workspace_id = Column(Integer, ForeignKey('workspace.id'), index=True, nullable=False)
13281338 workspace = relationship(
13291339 'Workspace',
1330 backref=backref("references",
1331 cascade="all, delete-orphan"),
13321340 foreign_keys=[workspace_id],
1341 backref=backref("references", cascade="all, delete-orphan"),
13331342 )
13341343
13351344 __table_args__ = (
13461355
13471356
13481357 class ReferenceVulnerabilityAssociation(db.Model):
1349
13501358 __tablename__ = 'reference_vulnerability_association'
13511359
13521360 vulnerability_id = Column(Integer, ForeignKey('vulnerability.id'), primary_key=True)
13641372
13651373
13661374 class PolicyViolationVulnerabilityAssociation(db.Model):
1367
13681375 __tablename__ = 'policy_violation_vulnerability_association'
13691376
13701377 vulnerability_id = Column(Integer, ForeignKey('vulnerability.id'), primary_key=True)
13711378 policy_violation_id = Column(Integer, ForeignKey('policy_violation.id'), primary_key=True)
13721379
1373 policy_violation = relationship("PolicyViolation", backref="policy_violation_associations", foreign_keys=[policy_violation_id])
1374 vulnerability = relationship("Vulnerability", backref=backref("policy_violationvulnerability_associations", cascade="all, delete-orphan"),
1375 foreign_keys=[vulnerability_id])
1380 policy_violation = relationship("PolicyViolation", backref=backref("policy_violation_associations", cascade="all, delete-orphan"), foreign_keys=[policy_violation_id])
1381 vulnerability = relationship("Vulnerability", backref=backref("policy_violation_vulnerability_associations", cascade="all, delete-orphan"), foreign_keys=[vulnerability_id])
13761382
13771383
13781384 class ReferenceTemplateVulnerabilityAssociation(db.Model):
1379
13801385 __tablename__ = 'reference_template_vulnerability_association'
13811386
13821387 vulnerability_id = Column(Integer, ForeignKey('vulnerability_template.id'), primary_key=True)
13891394 )
13901395 vulnerability = relationship(
13911396 "VulnerabilityTemplate",
1392 backref=backref('reference_template_vulnerability_associations',
1393 cascade="all, delete-orphan"),
1394 foreign_keys=[vulnerability_id]
1397 foreign_keys=[vulnerability_id],
1398 backref=backref('reference_template_vulnerability_associations', cascade="all, delete-orphan")
13951399 )
13961400
13971401
13981402 class PolicyViolationTemplateVulnerabilityAssociation(db.Model):
1399
14001403 __tablename__ = 'policy_violation_template_vulnerability_association'
14011404
14021405 vulnerability_id = Column(Integer, ForeignKey('vulnerability_template.id'), primary_key=True)
14031406 policy_violation_id = Column(Integer, ForeignKey('policy_violation_template.id'), primary_key=True)
14041407
1405 policy_violation = relationship("PolicyViolationTemplate", backref="policy_violation_template_associations", foreign_keys=[policy_violation_id])
1406 vulnerability = relationship("VulnerabilityTemplate", backref=backref("policy_violation_template_vulnerability_associations", cascade="all, delete-orphan"),
1407 foreign_keys=[vulnerability_id])
1408 policy_violation = relationship("PolicyViolationTemplate", backref=backref("policy_violation_template_associations", cascade="all, delete-orphan"), foreign_keys=[policy_violation_id])
1409 vulnerability = relationship("VulnerabilityTemplate", backref=backref("policy_violation_template_vulnerability_associations", cascade="all, delete-orphan"), foreign_keys=[vulnerability_id])
14081410
14091411
14101412 class PolicyViolationTemplate(Metadata):
14141416
14151417 __table_args__ = (
14161418 UniqueConstraint(
1417 'name',
1418 name='uix_policy_violation_template_name'),
1419 'name',
1420 name='uix_policy_violation_template_name'),
14191421 )
14201422
14211423 def __init__(self, name=None, **kwargs):
14281430 name = NonBlankColumn(Text)
14291431
14301432 workspace_id = Column(
1431 Integer,
1432 ForeignKey('workspace.id'),
1433 index=True,
1434 nullable=False
1435 )
1433 Integer,
1434 ForeignKey('workspace.id'),
1435 index=True,
1436 nullable=False
1437 )
14361438 workspace = relationship(
1437 'Workspace',
1438 backref=backref("policy_violations",
1439 cascade="all, delete-orphan"),
1440 foreign_keys=[workspace_id],
1441 )
1439 'Workspace',
1440 backref=backref("policy_violations",
1441 cascade="all, delete-orphan"),
1442 foreign_keys=[workspace_id],
1443 )
14421444
14431445 __table_args__ = (
14441446 UniqueConstraint(
1445 'name',
1446 'workspace_id',
1447 name='uix_policy_violation_template_name_vulnerability_workspace'),
1447 'name',
1448 'workspace_id',
1449 name='uix_policy_violation_template_name_vulnerability_workspace'),
14481450 )
14491451
14501452 def __init__(self, name=None, workspace_id=None, **kwargs):
14751477 'Service',
14761478 backref=backref('credentials', cascade="all, delete-orphan"),
14771479 foreign_keys=[service_id],
1478 )
1479
1480 workspace_id = Column(Integer, ForeignKey('workspace.id'), index=True, nullable=False)
1480 )
1481
1482 # 1 workspace <--> N credentials
1483 # 1 to N (the FK is placed in the child) and bidirectional (backref)
1484 workspace_id = Column(Integer, ForeignKey('workspace.id', ondelete='CASCADE'), index=True, nullable=False)
14811485 workspace = relationship(
14821486 'Workspace',
1483 backref=backref('credentials', cascade="all, delete-orphan"),
14841487 foreign_keys=[workspace_id],
1488 backref=backref('credentials', cascade="all, delete-orphan", passive_deletes=True),
14851489 )
14861490
14871491 _host_ip_query = (
14881492 select([Host.ip])
1489 .where(text('credential.host_id = host.id'))
1493 .where(text('credential.host_id = host.id'))
14901494 )
14911495
14921496 _service_ip_query = (
14931497 select([text('host_inner.ip || \'/\' || service.name')])
1494 .select_from(text('host as host_inner, service'))
1495 .where(text('credential.service_id = service.id and '
1496 'host_inner.id = service.host_id'))
1498 .select_from(text('host as host_inner, service'))
1499 .where(text('credential.service_id = service.id and '
1500 'host_inner.id = service.host_id'))
14971501 )
14981502
14991503 target_ip = column_property(
15001504 case([
15011505 (text('credential.host_id IS NOT null'),
1502 _host_ip_query.as_scalar()),
1506 _host_ip_query.as_scalar()),
15031507 (text('credential.service_id IS NOT null'),
1504 _service_ip_query.as_scalar())
1508 _service_ip_query.as_scalar())
15051509 ]),
15061510 deferred=True
15071511 )
1508
15091512
15101513 __table_args__ = (
15111514 CheckConstraint('(host_id IS NULL AND service_id IS NOT NULL) OR '
15121515 '(host_id IS NOT NULL AND service_id IS NULL)',
15131516 name='check_credential_host_service'),
15141517 UniqueConstraint(
1515 'username',
1516 'host_id',
1517 'service_id',
1518 'workspace_id',
1519 name='uix_credential_username_host_service_workspace'
1520 ),
1518 'username',
1519 'host_id',
1520 'service_id',
1521 'workspace_id',
1522 name='uix_credential_username_host_service_workspace'
1523 ),
15211524 )
15221525
15231526 @property
15251528 return self.host or self.service
15261529
15271530
1528
15291531 association_workspace_and_agents_table = Table(
1530 'association_workspace_and_agents_table',
1531 db.Model.metadata,
1532 Column('workspace_id', Integer, ForeignKey('workspace.id')),
1533 Column('agent_id', Integer, ForeignKey('agent.id'))
1534 )
1532 'association_workspace_and_agents_table',
1533 db.Model.metadata,
1534 Column('workspace_id', Integer, ForeignKey('workspace.id')),
1535 Column('agent_id', Integer, ForeignKey('agent.id'))
1536 )
15351537
15361538
15371539 class Workspace(Metadata):
17101712 name = NonBlankColumn(Text)
17111713
17121714 workspace_id = Column(
1713 Integer,
1714 ForeignKey('workspace.id'),
1715 index=True,
1716 nullable=False
1717 )
1715 Integer,
1716 ForeignKey('workspace.id'),
1717 index=True,
1718 nullable=False
1719 )
17181720
17191721 workspace = relationship(
17201722 'Workspace',
1721 backref=backref('scope', cascade="all, delete-orphan"),
1722 foreign_keys=[workspace_id],
1723 )
1723 backref=backref('scope', cascade="all, delete-orphan"),
1724 foreign_keys=[workspace_id],
1725 )
17241726
17251727 __table_args__ = (
17261728 UniqueConstraint('name', 'workspace_id',
17531755
17541756
17551757 class User(db.Model, UserMixin):
1756
17571758 __tablename__ = 'faraday_user'
17581759 ROLES = ['admin', 'pentester', 'client', 'asset_owner']
17591760 OTP_STATES = ["disabled", "requested", "confirmed"]
17741775 role = Column(Enum(*ROLES, name='user_roles'),
17751776 nullable=False, default='client')
17761777 _otp_secret = Column(
1777 String(32),
1778 name="otp_secret", nullable=True)
1778 String(32),
1779 name="otp_secret", nullable=True
1780 )
17791781 state_otp = Column(Enum(*OTP_STATES, name='user_otp_states'), nullable=False, default="disabled")
17801782 preferences = Column(JSONType, nullable=True, default={})
1783 fs_uniquifier = Column(String(64), unique=True, nullable=False) # flask-security
17811784
17821785 # TODO: add many to many relationship to add permission to workspace
17831786
18491852 backref=backref('methodologies')
18501853 )
18511854 template_id = Column(
1852 Integer,
1853 ForeignKey('methodology_template.id',
1854 ondelete="SET NULL"),
1855 index=True,
1856 nullable=True,
1857 )
1858
1855 Integer,
1856 ForeignKey('methodology_template.id',
1857 ondelete="SET NULL"),
1858 index=True,
1859 nullable=True,
1860 )
1861
1862 # 1 workspace <--> N methodologies
1863 # 1 to N (the FK is placed in the child) and bidirectional (backref)
1864 workspace_id = Column(Integer, ForeignKey('workspace.id'), index=True, nullable=False)
18591865 workspace = relationship(
18601866 'Workspace',
18611867 backref=backref('methodologies', cascade="all, delete-orphan"),
18621868 )
1863 workspace_id = Column(Integer, ForeignKey('workspace.id'), index=True, nullable=False)
18641869
18651870 @property
18661871 def parent(self):
18861891 'MethodologyTemplate',
18871892 backref=backref('tasks', cascade="all, delete-orphan"))
18881893 template_id = Column(
1889 Integer,
1890 ForeignKey('methodology_template.id'),
1891 index=True,
1892 nullable=False,
1893 )
1894 Integer,
1895 ForeignKey('methodology_template.id'),
1896 index=True,
1897 nullable=False,
1898 )
18941899
18951900 # __table_args__ = (
18961901 # UniqueConstraint(template_id, name='uix_task_template_name_desc_template_delete'),
19341939 secondary="task_assigned_to_association")
19351940
19361941 methodology_id = Column(
1937 Integer,
1938 ForeignKey('methodology.id'),
1939 index=True,
1940 nullable=False,
1941 )
1942 Integer,
1943 ForeignKey('methodology.id'),
1944 index=True,
1945 nullable=False,
1946 )
19421947 methodology = relationship(
19431948 'Methodology',
19441949 backref=backref('tasks', cascade="all, delete-orphan")
19451950 )
19461951
19471952 template_id = Column(
1948 Integer,
1949 ForeignKey('task_template.id'),
1950 index=True,
1951 nullable=True,
1952 )
1953 Integer,
1954 ForeignKey('task_template.id'),
1955 index=True,
1956 nullable=True,
1957 )
19531958 template = relationship('TaskTemplate', backref='tasks')
19541959
1960 # 1 workspace <--> N tasks
1961 # 1 to N (the FK is placed in the child) and bidirectional (backref)
1962 workspace_id = Column(Integer, ForeignKey('workspace.id'), index=True, nullable=False)
19551963 workspace = relationship(
19561964 'Workspace',
19571965 backref=backref('tasks', cascade="all, delete-orphan")
19581966 )
1959 workspace_id = Column(Integer, ForeignKey('workspace.id'), index=True, nullable=False)
19601967
19611968 # __table_args__ = (
19621969 # UniqueConstraint(TaskABC.name, methodology_id, workspace_id, name='uix_task_name_desc_methodology_workspace'),
20142021 foreign_keys=[reply_to_id]
20152022 )
20162023
2017 workspace_id = Column(Integer, ForeignKey('workspace.id'), index=True,
2018 nullable=False)
2024 # 1 workspace <--> N comments
2025 # 1 to N (the FK is placed in the child) and bidirectional (backref)
2026 workspace_id = Column(Integer, ForeignKey('workspace.id'), index=True, nullable=False)
20192027 workspace = relationship(
20202028 'Workspace',
20212029 foreign_keys=[workspace_id],
20712079 collection_class=set,
20722080 )
20732081 filter = Column(JSONType, nullable=True, default=[])
2082
20742083 @property
20752084 def parent(self):
20762085 return
20842093
20852094
20862095 class Notification(db.Model):
2087
20882096 __tablename__ = 'notification'
20892097 id = Column(Integer, primary_key=True)
20902098
20922100 user_notified = relationship(
20932101 'User',
20942102 backref=backref('notification', cascade="all, delete-orphan"),
2095 #primaryjoin="User.id == Notification.user_notified_id"
2103 # primaryjoin="User.id == Notification.user_notified_id"
20962104 )
20972105
20982106 object_id = Column(Integer, nullable=False)
21032111 workspace = relationship(
21042112 'Workspace',
21052113 backref=backref('notification', cascade="all, delete-orphan"),
2106 #primaryjoin="Notification.id == Notification.workspace_id"
2114 # primaryjoin="Notification.id == Notification.workspace_id"
21072115 )
21082116
21092117 mark_read = Column(Boolean, default=False, index=True)
21182126 __tablename__ = 'knowledge_base'
21192127 id = Column(Integer, primary_key=True)
21202128
2121 vulnerability_template_id = Column(
2122 Integer,
2123 ForeignKey('vulnerability_template.id'),
2124 index=True,
2125 nullable=True,
2126 )
2129 vulnerability_template_id = Column(
2130 Integer,
2131 ForeignKey('vulnerability_template.id'),
2132 index=True,
2133 nullable=True,
2134 )
21272135 vulnerability_template = relationship('VulnerabilityTemplate',
2128 backref=backref('knowledge', cascade="all, delete-orphan"),
2129 )
2136 backref=backref('knowledge', cascade="all, delete-orphan"),
2137 )
21302138
21312139 faraday_kb_id = Column(Text, nullable=False)
21322140 reference_id = Column(Integer, nullable=False)
21362144 false_positive = Column(Integer, nullable=False, default=0)
21372145 verified = Column(Integer, nullable=False, default=0)
21382146
2139 __table_args__ = (UniqueConstraint('external_identifier', 'tool_name', 'reference_id', name='uix_externalidentifier_toolname_referenceid'),)
2147 __table_args__ = (UniqueConstraint('external_identifier', 'tool_name', 'reference_id',
2148 name='uix_externalidentifier_toolname_referenceid'),)
2149
2150
2151 def rule_default_name(context):
2152 model = context.get_current_parameters()['model']
2153 create_date = context.get_current_parameters()['create_date']
2154 return f'Rule for model {model} @ {create_date.isoformat()}'
21402155
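`rule_default_name` above is a context-sensitive column default: SQLAlchemy calls it at INSERT time with an execution context whose `get_current_parameters()` exposes the other values of the row (the Faraday version also reads `create_date` the same way). A small sketch of the mechanism with illustrative columns:

from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

def default_name(context):
    # The context exposes the parameters of the INSERT currently being executed.
    params = context.get_current_parameters()
    return f"Rule for model {params['model']}"

class DemoRule(Base):
    __tablename__ = 'demo_rule'
    id = Column(Integer, primary_key=True)
    model = Column(String, nullable=False)
    name = Column(String, nullable=False, unique=True, default=default_name)

engine = create_engine('sqlite://')
Base.metadata.create_all(engine)
with Session(engine) as session:
    session.add(DemoRule(model='vulnerability'))
    session.commit()
    print(session.query(DemoRule.name).scalar())  # Rule for model vulnerability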
21412156
21422157 class Rule(Metadata):
21432158 __tablename__ = 'rule'
21442159 id = Column(Integer, primary_key=True)
2160 description = Column(String, nullable=False, default="")
21452161 model = Column(String, nullable=False)
21462162 object_parent = Column(String, nullable=True)
21472163 fields = Column(JSONType, nullable=True)
2148 object = Column(JSONType, nullable=False)
21492164 enabled = Column(Boolean, nullable=False, default=True)
2150 actions = relationship("Action", secondary="rule_action", backref=backref("rules"))
2165 actions = relationship("Action", secondary="rule_action", backref=backref("rules"), lazy='subquery')
2166 # 1 workspace <--> N rules
2167 # 1 to N (the FK is placed in the child) and bidirectional (backref)
21512168 workspace_id = Column(Integer, ForeignKey('workspace.id'), index=True, nullable=False)
2152 workspace = relationship('Workspace', backref=backref('rules', cascade="all, delete-orphan"))
2169 workspace = relationship(
2170 'Workspace',
2171 backref=backref('rules', cascade="all, delete-orphan")
2172 )
2173 conditions = relationship("Condition", back_populates="rule",
2174 cascade="all, delete-orphan", passive_deletes=True, lazy='subquery')
2175 name = Column(String, nullable=False, unique=True, default=rule_default_name)
21532176
21542177 @property
21552178 def parent(self):
21562179 return
2180
2181 @property
2182 def object(self):
2183 # TODO THIS MUST BE DELETED AND REIMPLEMENTED FOR NEW METHODS
2184 return json.dumps(
2185 [{condition.field: condition.value} for condition in self.conditions]
2186 )
21572187
21582188 @property
21592189 def disabled(self):
21682198 __tablename__ = 'action'
21692199 id = Column(Integer, primary_key=True)
21702200 name = Column(String, nullable=True)
2201 description = Column(String, nullable=False, default='')
21712202 command = Column(String, nullable=False)
21722203 field = Column(String, nullable=True)
21732204 value = Column(String, nullable=True)
22012232 active = Column(Boolean, nullable=False, default=True)
22022233 last_run = Column(DateTime)
22032234
2235 # 1 workspace <--> N schedules
2236 # 1 to N (the FK is placed in the child) and bidirectional (backref)
22042237 workspace_id = Column(Integer, ForeignKey('workspace.id'), index=True, nullable=False)
22052238 workspace = relationship(
22062239 'Workspace',
22292262 rule_id = Column(Integer, ForeignKey('rule.id'), index=True, nullable=False)
22302263 rule = relationship('Rule', foreign_keys=[rule_id], backref=backref('rule_actions', cascade="all, delete-orphan"))
22312264 action_id = Column(Integer, ForeignKey('action.id'), index=True, nullable=False)
2232 action = relationship('Action', foreign_keys=[action_id], backref=backref('rule_actions', cascade="all, delete-orphan"))
2265 action = relationship('Action', foreign_keys=[action_id],
2266 backref=backref('rule_actions', cascade="all, delete-orphan"))
2267
2268 __table_args__ = (UniqueConstraint('rule_id', 'action_id', name='rule_action_uc'),)
22332269
22342270
22352271 class Agent(Metadata):
22362272 __tablename__ = 'agent'
22372273 id = Column(Integer, primary_key=True)
22382274 token = Column(Text, unique=True, nullable=False, default=lambda:
2239 "".join([SystemRandom().choice(string.ascii_letters + string.digits)
2240 for _ in range(64)]))
2275 "".join([SystemRandom().choice(string.ascii_letters + string.digits)
2276 for _ in range(64)]))
22412277 workspaces = relationship(
22422278 'Workspace',
22432279 secondary=association_workspace_and_agents_table,
22522288
22532289 @property
22542290 def is_online(self):
2255 from faraday.server.websocket_factories import connected_agents # pylint:disable=import-outside-toplevel
2291 from faraday.server.websocket_factories import connected_agents # pylint:disable=import-outside-toplevel
22562292 return self.id in connected_agents
22572293
22582294 @property
22852321 message = Column(String, nullable=True)
22862322 executor_id = Column(Integer, ForeignKey('executor.id'), index=True, nullable=False)
22872323 executor = relationship('Executor', foreign_keys=[executor_id], backref=backref('executions', cascade="all, delete-orphan"))
2324 # 1 workspace <--> N agent_executions
2325 # 1 to N (the FK is placed in the child) and bidirectional (backref)
22882326 workspace_id = Column(Integer, ForeignKey('workspace.id'), index=True, nullable=False)
22892327 workspace = relationship(
22902328 'Workspace',
2291 backref=backref('agent_executions', cascade="all, delete-orphan"),
2329 backref=backref('agent_executions', cascade="all, delete-orphan")
22922330 )
22932331 parameters_data = Column(JSONType, nullable=False)
22942332 command_id = Column(Integer, ForeignKey('command.id'), index=True)
22982336 backref=backref('agent_execution_id', cascade="all, delete-orphan")
22992337 )
23002338
2301
23022339 @property
23032340 def parent(self):
23042341 return
23112348 field = Column(String)
23122349 value = Column(String)
23132350 operator = Column(String, default='equals')
2314 rule_id = Column(Integer, ForeignKey('rule.id'), index=True, nullable=False)
2315 rule = relationship('Rule', foreign_keys=[rule_id], backref=backref('conditions', cascade="all, delete-orphan"))
2351 # 1 rule <--> N conditions
2352 # 1 to N (the FK is placed in the child) and bidirectional (backref)
2353 # rule_id = Column(Integer, ForeignKey('rule.id'), index=True, nullable=False)
2354 # rule = relationship('Rule', foreign_keys=[rule_id], backref=backref('conditions', cascade="all, delete-orphan"))
2355 rule_id = Column(Integer, ForeignKey('rule.id', ondelete="CASCADE"), index=True, nullable=False)
2356 rule = relationship('Rule', back_populates="conditions")
23162357
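The commented-out lines keep the old `backref` form for reference; the new code declares both sides of the Rule/Condition relationship explicitly with `back_populates` and moves the cascade to the database via `ondelete="CASCADE"`. A hedged sketch of the two-sided declaration with illustrative models:

from sqlalchemy import Column, ForeignKey, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base, relationship

Base = declarative_base()

class DemoRule(Base):
    __tablename__ = 'demo_rule'
    id = Column(Integer, primary_key=True)
    # Collection side: ORM cascade for loaded children, DB cascade for the rest.
    conditions = relationship('DemoCondition', back_populates='rule',
                              cascade='all, delete-orphan', passive_deletes=True)

class DemoCondition(Base):
    __tablename__ = 'demo_condition'
    id = Column(Integer, primary_key=True)
    field = Column(String)
    value = Column(String)
    rule_id = Column(Integer, ForeignKey('demo_rule.id', ondelete='CASCADE'),
                     index=True, nullable=False)
    # Scalar side: back_populates names the paired attribute instead of creating a backref.
    rule = relationship('DemoRule', back_populates='conditions')

engine = create_engine('sqlite://')
Base.metadata.create_all(engine)
with Session(engine) as session:
    rule = DemoRule(conditions=[DemoCondition(field='severity', value='critical')])
    session.add(rule)
    session.commit()
    print(rule.conditions[0].rule is rule)  # True: both sides stay in sync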
23172358 @property
23182359 def parent(self):
23322373 rule_id = Column(Integer, ForeignKey('rule.id'), index=True, nullable=False)
23332374 rule = relationship('Rule', foreign_keys=[rule_id], backref=backref('executions', cascade="all, delete-orphan"))
23342375 command_id = Column(Integer, ForeignKey('command.id'), index=True, nullable=False)
2335 command = relationship('Command', foreign_keys=[command_id], backref=backref('rule_executions', cascade="all, delete-orphan"))
2376 command = relationship('Command', foreign_keys=[command_id],
2377 backref=backref('rule_executions', cascade="all, delete-orphan"))
23362378
23372379 @property
23382380 def parent(self):
23402382
23412383
23422384 class SearchFilter(Metadata):
2343
23442385 __tablename__ = 'search_filter'
23452386 id = Column(Integer, primary_key=True)
23462387 name = Column(String, nullable=False)
2347 json_query = Column(String, nullable=False) # meant to store json but just readonly
2388 json_query = Column(String, nullable=False) # meant to store json but just readonly
23482389 user_query = Column(String, nullable=False)
23492390
23502391
23792420 "COALESCE(website, ''), workspace_id, COALESCE(source_code_id, -1));"
23802421 )
23812422
2382
23832423 event.listen(
23842424 VulnerabilityGeneric.__table__,
23852425 'after_create',
23932433 )
23942434
23952435 # We have to import this after all models are defined
2396 import faraday.server.events # pylint: disable=unused-import
2436 import faraday.server.events # noqa F401
3232
3333 def _deserialize(self, value, attr, data, **kwargs):
3434 if value is not None and value:
35 return datetime.datetime.fromtimestamp(self._validated(value)/1e3)
35 return datetime.datetime.fromtimestamp(self._validated(value) / 1e3)
3636
3737
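The `_deserialize` above converts JavaScript-style epoch milliseconds into a `datetime`, dividing by 1e3 because the wire format is milliseconds while `fromtimestamp` expects seconds. A hedged, self-contained sketch of such a field; the `MillisecondField` name and the `_serialize` counterpart are illustrative additions, not the class shown in this hunk.

import datetime
from marshmallow import Schema, fields

class MillisecondField(fields.Field):
    """Illustrative field exchanging datetimes as epoch milliseconds."""

    def _serialize(self, value, attr, obj, **kwargs):
        if value is None:
            return None
        return int(value.timestamp() * 1000)

    def _deserialize(self, value, attr, data, **kwargs):
        if value is None or value == '':
            return None
        # Milliseconds on the wire, seconds for datetime.fromtimestamp().
        return datetime.datetime.fromtimestamp(int(value) / 1e3)

class DemoSchema(Schema):
    create_date = MillisecondField()

print(DemoSchema().load({'create_date': 1609459200000}))  # {'create_date': datetime.datetime(...)}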
3838 class FaradayCustomField(fields.Field):
183183
184184 return self.write_field._deserialize(value, attr, data, **kwargs)
185185
186 def _add_to_schema(self, field_name, schema):
186 def _bind_to_schema(self, field_name, schema):
187187 # Propagate to child fields
188 super()._add_to_schema(field_name, schema)
189 self.read_field._add_to_schema(field_name, schema)
190 self.write_field._add_to_schema(field_name, schema)
188 super()._bind_to_schema(field_name, schema)
189 self.read_field._bind_to_schema(field_name, schema)
190 self.write_field._bind_to_schema(field_name, schema)
191191
192192
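Marshmallow 3 renamed the schema-binding hook from `_add_to_schema` to `_bind_to_schema`, which is what this hunk tracks: a wrapper field must forward the call so its child fields also learn their name and parent schema. A hedged sketch of the pattern (the `ReadWriteField` below is illustrative, not Faraday's class):

from marshmallow import Schema, fields

class ReadWriteField(fields.Field):
    """Illustrative wrapper: one field formats output, another parses input."""

    def __init__(self, read_field, write_field, **kwargs):
        self.read_field = read_field
        self.write_field = write_field
        super().__init__(**kwargs)

    def _bind_to_schema(self, field_name, schema):
        # Called by marshmallow 3 when the schema is constructed; children must
        # be bound too so they know their field name and parent schema.
        super()._bind_to_schema(field_name, schema)
        self.read_field._bind_to_schema(field_name, schema)
        self.write_field._bind_to_schema(field_name, schema)

    def _serialize(self, value, attr, obj, **kwargs):
        return self.read_field._serialize(value, attr, obj, **kwargs)

    def _deserialize(self, value, attr, data, **kwargs):
        return self.write_field._deserialize(value, attr, data, **kwargs)

class DemoSchema(Schema):
    count = ReadWriteField(fields.String(), fields.Integer())

print(DemoSchema().dump({'count': 3}))    # {'count': '3'}
print(DemoSchema().load({'count': '3'}))  # {'count': 3}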
193193 class SeverityField(fields.String):
3636 report_json: dict,
3737 user_id: int):
3838 logger.info("Send Report data to workspace [%s]", workspace_name)
39 from faraday.server.web import app # pylint:disable=import-outside-toplevel
40 with app.app_context():
39 from faraday.server.web import get_app # pylint:disable=import-outside-toplevel
40 with get_app().app_context():
4141 ws = Workspace.query.filter_by(name=workspace_name).one()
4242 command = Command.query.filter_by(id=command_id).one()
4343 user = User.query.filter_by(id=user_id).one()
00 # Faraday Penetration Test IDE
11 # Copyright (C) 2016 Infobyte LLC (http://www.infobytesec.com/)
22 # See the file 'doc/LICENSE' for the license information
3
4 # I'm Py3
9595 # based systems). This second fork guarantees that the child is no
9696 # longer a session leader, preventing the daemon from ever acquiring
9797 # a controlling terminal.
98 pid = os.fork() # Fork a second child.
98 pid = os.fork() # Fork a second child.
9999 except OSError as e:
100100 raise Exception("%s [%d]" % (e.strerror, e.errno))
101101
109109 os.umask(UMASK)
110110 else:
111111 # exit() or _exit()? See below.
112 os._exit(0) # Exit parent (the first child) of the second child.
112 os._exit(0) # Exit parent (the first child) of the second child.
113113 else:
114114 # exit() or _exit()?
115115 # _exit is like exit(), but it doesn't call any functions registered
118118 # streams to be flushed twice and any temporary files may be unexpectedly
119119 # removed. It's therefore recommended that child branches of a fork()
120120 # and the parent branch(es) of a daemon use _exit().
121 os._exit(0) # Exit parent of the first child.
121 os._exit(0) # Exit parent of the first child.
122122
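The comments above walk through the classic double-fork daemonization: the first fork lets the original parent exit, `setsid()` detaches the child from its controlling terminal, and the second fork ensures the surviving process is not a session leader and so can never reacquire one. A compact, POSIX-only sketch of the technique (the log path and the trivial payload are illustrative):

import os
import sys

def daemonize():
    if os.fork() > 0:   # First fork: the original parent exits.
        sys.exit(0)
    os.setsid()         # Become session leader, detach from the terminal.
    if os.fork() > 0:   # Second fork: the session leader exits, so the
        os._exit(0)     # surviving child can never reacquire a terminal.
    os.chdir('/')
    os.umask(0)

if __name__ == '__main__':
    daemonize()
    with open('/tmp/demo-daemon.log', 'a') as log:
        log.write(f'daemon running as pid {os.getpid()}\n')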
123123 # NOTE(mrocha): Since we need all file descriptors opened during server
124124 # setup (i.e.: databases sessions, logging, socket connections, etc.), we
158158 logger.info("Faraday Server stopped successfully")
159159 except OSError as err:
160160 if err.errno == errno.EPERM:
161 logger.error("Couldn't stop Faraday Server. User doesn't"\
162 "have enough permissions")
161 logger.error("Couldn't stop Faraday Server. User doesn't "
162 "have enough permissions")
163163 return False
164164 else:
165165 raise err
181181 remove_pid_file(port)
182182 return None
183183 elif err.errno == errno.EPERM:
184 logger.warning("Server is running BUT the current user"\
185 "doesn't have enough access to operate with it")
184 logger.warning("Server is running BUT the current user "
185 "doesn't have enough access to operate with it")
186186 return pid
187187 else:
188188 raise
189189 else:
190190 return pid
191
191192
192193 def get_server_pid(port):
193194 if not Path(str(FARADAY_SERVER_PID_FILE).format(port)).exists():
199200 try:
200201 pid = int(pid_file.readline())
201202 except ValueError:
202 logger.warning('PID file was found but is corrupted. '\
203 'Assuming server is not running. Please check manually'\
204 'if Faraday Server is effectively running')
203 logger.warning('PID file was found but is corrupted. '
204 'Assuming server is not running. Please check manually '
205 'if Faraday Server is effectively running')
205206 remove_pid_file(port)
206207 return None
207208
155155 else:
156156 count_filter = [func.count(distinct(count_col))]
157157
158 count_q = query.statement.with_only_columns(count_filter).\
159 order_by(None).group_by(None)
158 count_q = query.statement.with_only_columns(count_filter). \
159 order_by(None).group_by(None)
160160 count = query.session.execute(count_q).scalar()
161161
162162 return count
213213 object_type = instance.__tablename__
214214 if object_type is None:
215215 if instance.__class__.__name__ in ['Vulnerability',
216 'VulnerabilityWeb',
217 'VulnerabilityCode']:
216 'VulnerabilityWeb',
217 'VulnerabilityCode']:
218218 object_type = 'vulnerability'
219219 else:
220220 raise RuntimeError(f"Unknown table for object: {instance}")
263263
264264 if get_object_type_for(obj) == 'vulnerability':
265265 # This is a special key due to model inheritance
266 from faraday.server.models import VulnerabilityGeneric # pylint:disable=import-outside-toplevel
266 from faraday.server.models import VulnerabilityGeneric # pylint:disable=import-outside-toplevel
267267 klass = VulnerabilityGeneric
268268 else:
269269 klass = obj.__class__
310310
311311
312312 def is_unique_constraint_violation(exception):
313 from faraday.server.models import db # pylint:disable=import-outside-toplevel
313 from faraday.server.models import db # pylint:disable=import-outside-toplevel
314314 if db.engine.dialect.name != 'postgresql':
315315 # Not implemented for RDBMS other than postgres, we can live without
316316 # this since it is just an extra check
99
1010
1111 debug_logger = logging.getLogger(__name__)
12
1213
1314 class Timer:
1415 def __init__(self, tag, logger=None):
2728 #
2829 # Debug utility extracted from http://docs.sqlalchemy.org/en/latest/faq/performance.html
2930 #
31
32
3033 @contextlib.contextmanager
3134 def profiled():
3235 pr = cProfile.Profile()
0 import re
10 import csv
21 from io import StringIO, BytesIO
32 import logging
203202
204203 # Patch possible formula injection attacks
205204 def csv_escape(vuln_dict):
206 for key,value in vuln_dict.items():
205 for key, value in vuln_dict.items():
207206 if str(value).startswith('=') or str(value).startswith('+') or str(value).startswith('-') or str(value).startswith('@'):
208207 # Convert value to str just in case is has another type (like a list or
209208 # dict). This would be done anyway by the csv writer.
2626 VALID_OPERATORS = set(OPERATORS.keys()) - set(['desc', 'asc'])
2727
2828 logger = logging.getLogger(__name__)
29
2930
3031 class FlaskRestlessFilterSchema(Schema):
3132 name = fields.String(required=True)
174175 def _model_class(self):
175176 return VulnerabilityWeb
176177
178
177179 class FlaskRestlessVulnerabilityTemplateFilterSchema(FlaskRestlessFilterSchema):
178180 def _model_class(self):
179181 return VulnerabilityTemplate
180182
183
181184 class FlaskRestlessHostFilterSchema(FlaskRestlessFilterSchema):
182185 def _model_class(self):
183186 return Host
184187
188
185189 class FlaskRestlessWorkspaceFilterSchema(FlaskRestlessFilterSchema):
186190 def _model_class(self):
187191 return Workspace
188192
193
189194 class FlaskRestlessUserFilterSchema(FlaskRestlessFilterSchema):
190195 def _model_class(self):
191196 return User
192
193197
194198
195199 class FlaskRestlessOperator(Schema):
33 See the file 'doc/LICENSE' for the license information
44
55 """
6
7
68 def remove_null_caracters(string):
79 string = string.replace('\x00', '')
810 string = string.replace('\00', '')
2020 from sqlalchemy import and_, or_
2121 from sqlalchemy import inspect as sqlalchemy_inspect
2222 from sqlalchemy.ext.associationproxy import AssociationProxy
23 from sqlalchemy.ext.hybrid import hybrid_property
2324 from sqlalchemy.orm.attributes import InstrumentedAttribute
2425 from sqlalchemy.orm.attributes import QueryableAttribute
2526 from sqlalchemy.orm import ColumnProperty
111112 #: be described by the strings ``'=='``, ``'eq'``, ``'equals'``, etc.
112113 OPERATORS = {
113114 # Operators which accept a single argument.
114 'is_null': lambda f: f == None,
115 'is_not_null': lambda f: f != None,
115 'is_null': lambda f: f is None,
116 'is_not_null': lambda f: f is not None,
116117 'desc': lambda f: f.desc,
117118 'asc': lambda f: f.asc,
118119 # Operators which accept two arguments.
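On SQLAlchemy column expressions the overloaded `== None` renders `IS NULL`, whereas the plain Python `is` operator only yields a boolean; the explicit `is_()` / `isnot()` methods sidestep both the linter complaint and the ambiguity. A short sketch with an illustrative model:

from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class Item(Base):
    __tablename__ = 'item'
    id = Column(Integer, primary_key=True)
    label = Column(String, nullable=True)

engine = create_engine('sqlite://')
Base.metadata.create_all(engine)
with Session(engine) as session:
    session.add_all([Item(label=None), Item(label='x')])
    session.commit()
    # .is_() / .isnot() render IS NULL / IS NOT NULL explicitly.
    nulls = session.query(Item).filter(Item.label.is_(None)).count()
    not_nulls = session.query(Item).filter(Item.label.isnot(None)).count()
    print(nulls, not_nulls)  # 1 1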
279280 class JunctionFilter(Filter):
280281 def __init__(self, *subfilters):
281282 self.subfilters = subfilters
283
282284 def __iter__(self):
283285 return iter(self.subfilters)
284286
493495 create_filt = QueryBuilder._create_filter
494496
495497 def create_filters(filt):
496 if not getattr(filt, 'fieldname', False) or filt.fieldname.split('__')[0] in valid_model_fields:
498 if not getattr(filt, 'fieldname', False) \
499 or filt.fieldname.split('__')[0] in valid_model_fields:
497500 try:
498501 return create_filt(model, filt)
499 except AttributeError:
502 except AttributeError as e:
500503 # Can't create the filter since the model or submodel does not have the attribute (usually mapper)
501 return None
502 return None
504 raise AttributeError(f"Foreign field {filt.fieldname.split('__')[0]} not found in submodel")
505 raise AttributeError(f"Field {filt.fieldname} not found in model")
503506
504507 return create_filters
505508
541544 query = session.query(*select_fields)
542545 else:
543546 query = session.query(model)
547
544548 # This function call may raise an exception.
545 valid_model_fields = [str(algo).split('.')[1] for algo in sqlalchemy_inspect(model).attrs]
549 valid_model_fields = []
550 for orm_descriptor in sqlalchemy_inspect(model).all_orm_descriptors:
551 if isinstance(orm_descriptor, InstrumentedAttribute):
552 valid_model_fields.append(str(orm_descriptor).split('.')[1])
553 if isinstance(orm_descriptor, hybrid_property):
554 valid_model_fields.append(orm_descriptor.__name__)
546555
547556 filters_generator = map( # pylint: disable=W1636
548557 QueryBuilder.create_filters_func(model, valid_model_fields),
549558 search_params.filters
550559 )
560
551561 filters = [filt for filt in filters_generator if filt is not None]
552562
553563 # Multiple filter criteria at the top level of the provided search
4747
4848 response.direct_passthrough = False
4949
50 if (response.status_code < 200 or
51 response.status_code >= 300 or
52 'Content-Encoding' in response.headers):
50 if (response.status_code < 200
51 or response.status_code >= 300
52 or 'Content-Encoding' in response.headers):
5353 return response
5454 gzip_buffer = IO()
5555 gzip_file = gzip.GzipFile(mode='wb',
11 # Copyright (C) 2016 Infobyte LLC (http://www.infobytesec.com/)
22 # See the file 'doc/LICENSE' for the license information
33 import sys
4 import functools
54 import logging
65 from signal import SIGABRT, SIGILL, SIGINT, SIGSEGV, SIGTERM, SIG_DFL, signal
76
1918
2019 from flask_mail import Mail
2120
22 from OpenSSL.SSL import Error as SSLError
23
2421 import faraday.server.config
2522
2623 from faraday.server.config import CONST_FARADAY_HOME_PATH, smtp
27 from faraday.server.utils import logger
2824 from faraday.server.threads.reports_processor import ReportsManager, REPORTS_QUEUE
2925 from faraday.server.threads.ping_home import PingHomeThread
3026 from faraday.server.app import create_app
3329 BroadcastServerProtocol
3430 )
3531
32 FARADAY_APP = None
3633
37 app = create_app() # creates a Flask(__name__) app
38 # After 'Create app'
39 app.config['MAIL_SERVER'] = smtp.host
40 app.config['MAIL_PORT'] = smtp.port
41 app.config['MAIL_USE_SSL'] = smtp.ssl
42 app.config['MAIL_USERNAME'] = smtp.username
43 app.config['MAIL_PASSWORD'] = smtp.password
44 mail = Mail(app)
4534 logger = logging.getLogger(__name__)
4635
4736
8271 WEB_UI_LOCAL_PATH = faraday.server.config.FARADAY_BASE / 'server/www'
8372
8473 def __init__(self):
85 logger.info(f'Starting web server at http://'
74
75 logger.info('Starting web server at http://'
8676 f'{faraday.server.config.faraday_server.bind_address}:'
8777 f'{faraday.server.config.faraday_server.port}/')
8878 self.__websocket_port = faraday.server.config.faraday_server.websocket_port or 9000
9787 certs = (faraday.server.config.ssl.keyfile, faraday.server.config.ssl.certificate)
9888 if not all(certs):
9989 logger.critical("HTTPS request but SSL certificates are not configured")
100 sys.exit(1) # Abort web-server startup
90 sys.exit(1) # Abort web-server startup
10191 return ssl.DefaultOpenSSLContextFactory(*certs)
10292
10393 def __build_server_tree(self):
114104 return FileWithoutDirectoryListing(WebServer.WEB_UI_LOCAL_PATH)
115105
116106 def __build_api_resource(self):
117 return FaradayWSGIResource(reactor, reactor.getThreadPool(), app)
107 return FaradayWSGIResource(reactor, reactor.getThreadPool(), get_app())
118108
119109 def __build_websockets_resource(self):
120110 websocket_port = int(faraday.server.config.faraday_server.websocket_port)
180170 logger.exception(e)
181171 self.__stop_all_threads()
182172 sys.exit(1)
183 # I'm Py3
173
174
175 def get_app():
176 global FARADAY_APP # pylint: disable=W0603
177 if not FARADAY_APP:
178 app = create_app() # creates a Flask(__name__) app
179 # After 'Create app'
180 app.config['MAIL_SERVER'] = smtp.host
181 app.config['MAIL_PORT'] = smtp.port
182 app.config['MAIL_USE_SSL'] = smtp.ssl
183 app.config['MAIL_USERNAME'] = smtp.username
184 app.config['MAIL_PASSWORD'] = smtp.password
185 mail = Mail(app)
186 FARADAY_APP = app
187 return FARADAY_APP
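With the module-level `app = create_app()` gone, callers obtain the lazily created singleton via `get_app()` and push an application context themselves, as the websocket and `manage` hunks below do. A minimal usage sketch (the workspace count query is illustrative):

from faraday.server.models import Workspace, db
from faraday.server.web import get_app

app = get_app()           # First call creates and configures the Flask app.
with app.app_context():   # Database access requires an application context.
    print(db.session.query(Workspace).count())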
2929 from faraday.server.api.modules.websocket_auth import decode_agent_websocket_token
3030 from faraday.server.events import changes_queue
3131
32
3332 logger = logging.getLogger(__name__)
34
3533
3634 connected_agents = {}
3735
5149 return (protocol, headers)
5250
5351 def onMessage(self, payload, is_binary):
54 from faraday.server.web import app # pylint:disable=import-outside-toplevel
5552 """
5653 We only support JOIN and LEAVE workspace messages.
5754 When authentication is implemented we need to verify
5956 When authentication is implemented we need to reply
6057 the client if the join failed.
6158 """
59 from faraday.server.web import get_app # pylint:disable=import-outside-toplevel
6260 if not is_binary:
6361 message = json.loads(payload)
6462 if message['action'] == 'JOIN_WORKSPACE':
6664 logger.warning(f'Invalid join workspace message: {message}')
6765 self.sendClose()
6866 return
69 signer = itsdangerous.TimestampSigner(app.config['SECRET_KEY'],
67 signer = itsdangerous.TimestampSigner(get_app().config['SECRET_KEY'],
7068 salt="websocket")
7169 try:
7270 workspace_id = signer.unsign(message['token'], max_age=60)
7674 '{}'.format(message['workspace']))
7775 logger.exception(e)
7876 else:
79 with app.app_context():
77 with get_app().app_context():
8078 workspace = Workspace.query.get(int(workspace_id))
8179 if workspace.name != message['workspace']:
8280 logger.warning(
9593 logger.warning("Invalid agent join message")
9694 self.sendClose(1000, reason="Invalid JOIN_AGENT message")
9795 return False
98 with app.app_context():
96 with get_app().app_context():
9997 try:
10098 agent = decode_agent_websocket_token(message['token'])
10199 update_executors(agent, message['executors'])
106104 # factory will now send broadcast messages to the agent
107105 return self.factory.join_agent(self, agent)
108106 if message['action'] == 'LEAVE_AGENT':
109 with app.app_context():
107 with get_app().app_context():
110108 (agent_id,) = [
111109 k
112110 for (k, v) in connected_agents.items()
116114 assert agent is not None # TODO the agent could be deleted here
117115 return self.factory.leave_agent(self, agent)
118116 if message['action'] == 'RUN_STATUS':
119 with app.app_context():
117 with get_app().app_context():
120118 if 'executor_name' not in message:
121119 logger.warning(f'Missing executor_name param in message: {message}')
122120 return True
150148 else:
151149 agent_execution.successful = message.get('successful', None)
152150 agent_execution.running = message.get('running', None)
153 agent_execution.message = message.get('message','')
151 agent_execution.message = message.get('message', '')
154152 db.session.commit()
155153 else:
156154 logger.exception(
1515 import faraday.server.web
1616 from faraday.server.models import db, Workspace
1717 from faraday.server.utils import daemonize
18 from faraday.server.web import app
18 from faraday.server.web import get_app
1919 from alembic.script import ScriptDirectory
2020 from alembic.config import Config
2121
4747
4848
4949 def check_postgresql():
50 with app.app_context():
50 with get_app().app_context():
5151 try:
5252 if not db.session.query(Workspace).count():
5353 logger.warning('No workspaces found')
7272 script = ScriptDirectory.from_config(config)
7373
7474 head_revision = script.get_current_head()
75 with app.app_context():
75 with get_app().app_context():
7676 try:
7777 conn = db.session.connection()
7878 except ImportError:
101101 "with a schema migration not merged yet. If you are a "
102102 "normal user, consider reporting this bug back to us"
103103 )
104
104105
105106 def main():
106107 os.chdir(faraday.server.config.FARADAY_BASE)
33 See the file 'doc/LICENSE' for the license information
44
55 """
6 # I'm Py3
3333 self.updateMetadata()
3434 return func(self, *args, **kwargs)
3535 return wrapper
36
37 # I'm Py3
8181
8282 logger = logging.getLogger(__name__)
8383
84
8485 class FaradayAPIPlugin(BasePlugin):
8586 """APISpec plugin for Flask"""
8687
124125 class_model = view_instance.model_class.__name__
125126 else:
126127 class_model = 'No name'
127 #print(f'{view_name} / {class_model}')
128 logger.debug(f'{view_name} / {class_model} / {rule.methods} / {view_name} / {view_instance._get_schema_class().__name__}')
128 # print(f'{view_name} / {class_model}')
129 logger.debug(
130 f'{view_name} / {class_model} / {rule.methods} / {view_name} / {view_instance._get_schema_class().__name__}')
129131 operations[view_name] = yaml_utils.load_yaml_from_docstring(
130 view.__doc__.format(schema_class=view_instance._get_schema_class().__name__, class_model=class_model, tag_name=class_model)
132 view.__doc__.format(schema_class=view_instance._get_schema_class().__name__,
133 class_model=class_model, tag_name=class_model)
131134 )
132135 elif hasattr(view, "__doc__"):
133136 if not view.__doc__:
137140 else:
138141 class_model = 'No name'
139142 for method in rule.methods:
140 logger.debug(f'{view_name} / {class_model} / {rule.methods} / {method} / {view_instance._get_schema_class().__name__}')
143 logger.debug(
144 f'{view_name} / {class_model} / {rule.methods} / {method} / {view_instance._get_schema_class().__name__}')
141145 if method not in ['HEAD', 'OPTIONS'] or os.environ.get("FULL_API_DOC", None):
142146 operations[method.lower()] = yaml_utils.load_yaml_from_docstring(
143 view.__doc__.format(schema_class=view_instance._get_schema_class().__name__, class_model=class_model, tag_name=class_model)
147 view.__doc__.format(schema_class=view_instance._get_schema_class().__name__,
148 class_model=class_model, tag_name=class_model)
144149 )
145150 if hasattr(view, "view_class") and issubclass(view.view_class, MethodView):
146151 for method in view.methods:
6666 ./packages/apispec-webframeworks
6767 { };
6868
69 bleach =
70 self.callPackage
71 ./packages/bleach
72 { };
73
6974 faraday-plugins =
7075 self.callPackage
7176 ./packages/faraday-plugins
1010 pname =
1111 "anyascii";
1212 version =
13 "0.1.7";
13 "0.2.0";
1414
1515 src =
1616 fetchPypi {
1818 pname
1919 version;
2020 sha256 =
21 "1xcrhmgpv8da34sg62r0yfxzyq2kwgiaardkih9z3sm96dlhgsyh";
21 "1b6jdd9nx15py0jqjdn154m6m491517sqlk57bbyj3x4xzywadkh";
2222 };
2323
2424 # TODO FIXME
0 # WARNING: This file was automatically generated. You should avoid editing it.
1 # If you run pynixify again, the file will be either overwritten or
2 # deleted, and you will lose the changes you made to it.
3
4 { buildPythonPackage
5 , fetchPypi
6 , lib
7 , packaging
8 , six
9 , webencodings
10 }:
11
12 buildPythonPackage rec {
13 pname =
14 "bleach";
15 version =
16 "3.3.0";
17
18 src =
19 fetchPypi {
20 inherit
21 pname
22 version;
23 sha256 =
24 "0cx4jyvd7hlaiiq2cq6vps689b978w3kyqqrvkckvs75743igcwq";
25 };
26
27 propagatedBuildInputs =
28 [
29 packaging
30 six
31 webencodings
32 ];
33
34 # TODO FIXME
35 doCheck =
36 false;
37
38 meta =
39 with lib; {
40 description =
41 "An easy safelist-based HTML-sanitizing tool.";
42 homepage =
43 "https://github.com/mozilla/bleach";
44 };
45 }
66 , apispec-webframeworks
77 , autobahn
88 , bcrypt
9 , bleach
910 , buildPythonPackage
1011 , click
1112 , colorama
2324 , flask-limiter
2425 , flask-security-too
2526 , flask_login
27 , flask_mail
2628 , flask_sqlalchemy
2729 , hypothesis
2830 , lib
6062 pname =
6163 "faradaysec";
6264 version =
63 "3.14.4";
65 "3.15.0";
6466
6567 src =
6668 lib.cleanSource
8587 wtforms
8688 flask_login
8789 flask-security-too
90 bleach
8891 marshmallow
8992 pillow
9093 psycopg2
112115 pyyaml
113116 pyotp
114117 flask-limiter
118 flask_mail
115119 ];
116120 checkInputs =
117121 [
11 # If you run pynixify again, the file will be either overwritten or
22 # deleted, and you will lose the changes you made to it.
33
4 { Babel
4 { blinker
55 , buildPythonPackage
66 , email_validator
77 , fetchPypi
88 , flask
9 , flask-babelex
109 , flask_login
11 , flask_mail
1210 , flask_principal
1311 , flask_wtf
1412 , itsdangerous
1513 , lib
1614 , passlib
17 , pytestrunner
18 , twine
19 , wheel
2015 }:
2116
2217 buildPythonPackage rec {
2318 pname =
2419 "flask-security-too";
2520 version =
26 "3.4.5";
21 "4.0.1";
2722
2823 src =
2924 fetchPypi {
3227 pname =
3328 "Flask-Security-Too";
3429 sha256 =
35 "19cdad65bxs23zz5hmr41s12359ija3p2kk0mbf9jsk1swg0b7d0";
30 "1q7izrmz84wwhmzs39zgjvr90vb22z3szsm8mp3a3qnb1377z5n2";
3631 };
3732
38 buildInputs =
39 [
40 Babel
41 pytestrunner
42 twine
43 wheel
44 ];
4533 propagatedBuildInputs =
4634 [
4735 flask
4836 flask_login
49 flask_mail
5037 flask_principal
5138 flask_wtf
52 flask-babelex
5339 email_validator
5440 itsdangerous
5541 passlib
42 blinker
5643 ];
5744
5845 # TODO FIXME
99 email_validator
1010 WTForms>=2.1
1111 flask-login>=0.5.0
12 Flask-Security-Too>=3.4.4,<4.0.0
12 Flask-Security-Too>=4.0.0
13 bleach>=3.3.0
1314 marshmallow>=3.0.0,<3.11.0
1415 Pillow>=4.2.1
1516 psycopg2
3738 pyyaml
3839 pyotp>=2.6.0
3940 Flask-Limiter
41 Flask-Mail
1010 '''
1111
1212 import os
13 import re
1413 import sys
1514 import subprocess
1615 import logging
1918 from tempfile import mkdtemp
2019 from shutil import rmtree
2120
22 VERSIONS = ['white', 'pink', 'black']
21 VERSIONS = ['white', 'black']
2322 BRANCH_FORMAT = 'origin/{}/dev'
23
2424
2525 @contextmanager
2626 def chdir(directory):
2929 os.chdir(directory)
3030 yield
3131 os.chdir(current)
32
3233
3334 @contextmanager
3435 def temp_worktree(branch=None):
4445 yield
4546 rmtree(directory)
4647 subprocess.check_output(['git', 'worktree', 'prune'])
48
4749
4850 def check_merge(dst_branch, cur_branch='HEAD'):
4951 """Return a boolean indicating if the merge from cur_branch
8486
8587 def version_of_branch(branch_name):
8688 """
87 >>> version_of_branch('tkt_white_this_is_not_a_pink_branch')
89 >>> version_of_branch('tkt_white_this_is_not_a_ee_branch')
8890 'white'
8991 """
9092 positions = {version: branch_name.find(version)
118120 else:
119121 branches_to_test.append(BRANCH_FORMAT.format(target_version))
120122
121 logging.info('Testing merges in branches %s' % branches_to_test)
123 logging.info(f'Testing merges in branches {branches_to_test}')
122124
123125 success = True
124126 cur_branch = branch
129131 else:
130132 success = False
131133 logger.error("Merge into %s failed :(", dst_branch)
132 print()
133 print()
134134
135135 if not success:
136136 sys.exit(1)
142142 parser.add_argument('-l', '--log-level', default='debug')
143143 args = parser.parse_args()
144144 main(args.branch)
145
146
147 # I'm Py3
148
11 # Check that a white branch doesn't contain commits of pink or black
22 # and a pink branch has no black commits
33 # Requires setting BRANCH_NAME environment variable
4 PINK_COMMIT=da7a869e186f61f1b138392734be4eae62cb2e31 # Always redirect to login page when user is logged out
5 BLACK_COMMIT=ec3dcfbe8955d41125944e82aa084b441c0b9e77 # Fix msg in webshell
4 PROF_COMMIT=da7a869e186f61f1b138392734be4eae62cb2e31 # Always redirect to login page when user is logged out
5 CORP_COMMIT=ec3dcfbe8955d41125944e82aa084b441c0b9e77 # Fix msg in webshell
66
77 if [ $CI_COMMIT_REF_NAME ]; then
88 BRANCH_NAME=$CI_COMMIT_REF_NAME
1010 BRANCH_NAME=$(git rev-parse --abbrev-ref HEAD)
1111 fi
1212
13 function fail(){
13 fail(){
1414 echo "Branch $BRANCH_NAME contains a commit of another version ($1). You shouldn't do that!!!!!!"
1515 exit 1
1616 }
1717
18 function check_no_commits(){
18 check_no_commits(){
1919 # Check that current branch doesn't contain the commits passed as arguments
2020 # If it does contain at least one of then, quit the script with a non-zero exit code
2121 for commit in $*
2525 }
2626
2727 echo current branch $(git rev-parse --abbrev-ref HEAD) should be equal to $BRANCH_NAME
28 echo $BRANCH_NAME | grep -i white && check_no_commits $PINK_COMMIT $BLACK_COMMIT
29 echo $BRANCH_NAME | grep -i pink && check_no_commits $BLACK_COMMIT
28 echo $BRANCH_NAME | grep -i white && check_no_commits $PROF_COMMIT $CORP_COMMIT
3029 exit 0
1616 if not args.local:
1717 BRANCH_NAME = f"origin/{BRANCH_NAME}"
1818
19 PINK_FILE = "faraday/server/api/modules/reports.py"
20 BLACK_FILE = "faraday/server/api/modules/jira.py"
19 PROF_FILE = "faraday/server/api/modules/reports.py"
20 CORP_FILE = "faraday/server/api/modules/integration_jira.py"
2121
2222 mode = args.mode
2323 if mode == "diff":
4141 print(f"Current branch {ACTUAL_BRANCH} should be equal to {BRANCH_NAME}")
4242 intersection = set()
4343 if "white" in BRANCH_NAME:
44 intersection = git_diff_intersection({PINK_FILE, BLACK_FILE})
45 elif "pink" in BRANCH_NAME:
46 intersection = git_diff_intersection({BLACK_FILE})
47 assert len(intersection) == 0, f"The {intersection} should not be in " \
48 f"{BRANCH_NAME}"
49 assert child.returncode == 0, (child.stdout, child.returncode)
44 intersection = git_diff_intersection({PROF_FILE, CORP_FILE})
45 assert len(intersection) == 0, f"The {intersection} should not be in" \
46 f" {BRANCH_NAME}"
47 assert child.returncode == 0, (child.stdout, child.returncode)
99 # The list of errors ignored is ordered by priority/easiness of the fix
1010 # First to fix
1111
12 ## Logic improve
13 ### comparison to None should be 'if cond is None:'
14 E711
15 ### ambiguous variable name 'x'
16 E741
17 ### the backslash is redundant between brackets
18 E502
19 ### 'x' imported but unused
20 F401
21 ### comparison to False should be 'if cond is False:' or 'if not cond:'
22 E712
23 ### redefinition of unused 'logger' from line 26
24 F811
2512
2613 ## Invalid escape sequence; probably fixed by adding r to specify regex str
2714 ### invalid escape sequence
3219 F841
3320
3421 ## New lines
35 ### no newline at end of file
36 W292
37 ### Blank line at end of file
38 W391
39 ### expected 1 blank line, found 0
40 E302
41 ### expected 2 blank line, found 1
42 E301
43 ### line break before binary operator
22 ### line break before binary operator, W503 is deprecated
4423 W503
45 ### line break after binary operator
46 W504
47 ### expected 2 blank lines after class or function definition, found 1
48 E305
49 ### too many blank lines (N)
50 E303
51
52 ## Spaces
53 ### whitespace after '['
54 E201
55 ### whitespace before ']'
56 E202
57 ### missing whitespace after ','
58 E231
59 ### multiple spaces after operator
60 E222
61 ### missing whitespace around arithmetic operator
62 E226
63 ### unexpected spaces around keyword / parameter equals
64 E251
65 ### missing whitespace around operator
66 E225
67 ### blank line contains whitespace
68 W293
69 ### trailing whitespace
70 W291
71 ### multiple spaces after ','
72 E241
73
74 ## Block comment
75 ### at least two spaces before inline comment
76 E261
77 ### inline comment should start with '# '
78 E262
79 ### block comment should start with '# '
80 E265
81 ### E266 too many leading '#' for block comment
82 E266
8324
8425 ## Visual
8526 ### continuation line missing indentation or outdented
4141 # and we don't want this!
4242 # Taken from https://github.com/pypa/setuptools_scm/issues/190#issuecomment-351181286
4343 import setuptools_scm.integration
44
4445 setuptools_scm.integration.find_files = lambda _: []
4546 except ImportError:
4647 pass
162163 # packages=find_packages(exclude=['contrib', 'docs', 'tests']), # Required
163164 # packages=[''],
164165 # packages=['faraday', 'faraday.server', 'faraday.utils'],
165 #packages=['faraday.' + package
166 # packages=['faraday.' + package
166167 # for package in find_packages(
167168 # '.', include=['server.*', 'config.*', 'utils.*', 'client.*',
168169 # 'server', 'config', 'utils', 'client'])
169170 # ] + ['faraday'],
170 #package_dir={'faraday': '.'},
171 # package_dir={'faraday': '.'},
171172 packages=find_packages(include=['faraday', 'faraday.*']),
172173
173174 # Specify which Python versions you support. In contrast to the
205206 # MANIFEST.in as well.
206207 include_package_data=True,
207208 package_data={ # Optional
208 '': ['requirements.txt',],
209 '': ['requirements.txt', ],
209210 },
210211
211212 # Although 'package_data' is the preferred approach, in some case you may
242243 # what's used to render the link text on PyPI.
243244 project_urls={ # Optional
244245 'Bug Reports': 'https://github.com/infobyte/faraday/issues',
245 #'Funding': 'https://donate.pypi.org',
246 # 'Funding': 'https://donate.pypi.org',
246247 'Say Thanks!': 'http://saythanks.io/to/faradaysec',
247248 'Source': 'https://github.com/infobyte/faraday/',
248249 },
33 See the file 'doc/LICENSE' for the license information
44
55 '''
6 # I'm Py3
1919 from faraday.server.app import create_app
2020 from faraday.server.models import db
2121 from tests import factories
22
2322
2423 TEST_DATA_PATH = Path(__file__).parent / 'data'
2524
5958 from flask import _app_ctx_stack
6059 _app_ctx_stack.top.sqlalchemy_queries = []
6160
62 ret = super(CustomClient, self).open(*args, **kwargs)
63 #Now set in flask 1.0
64 #if ret.headers.get('content-type') == 'application/json':
61 ret = super().open(*args, **kwargs)
62 # Now set in flask 1.0
63 # if ret.headers.get('content-type') == 'application/json':
6564 # try:
6665 # ret.json = json.loads(ret.data)
6766 # except ValueError:
7877 # we need to review sqlite configurations for persistence using PRAGMA.
7978 parser.addoption('--connection-string', default=f'sqlite:////{TEMPORATY_SQLITE.name}',
8079 help="Database connection string. Defaults to in-memory "
81 "sqlite if not specified:")
80 "sqlite if not specified:")
8281 parser.addoption('--ignore-nplusone', action='store_true',
8382 help="Globally ignore nplusone errors")
8483 parser.addoption("--with-hypothesis", action="store_true",
190189 @event.listens_for(session, "after_transaction_end")
191190 def restart_savepoint(session, transaction):
192191 if transaction.nested and not transaction._parent.nested:
193
194192 # ensure that state is expired the way
195193 # session.commit() at the top level normally does
196194 # (optional step)
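The conftest hunk above belongs to SQLAlchemy's well-known "run every test inside a SAVEPOINT" recipe. A condensed, self-contained sketch of the full pattern (simplified; not Faraday's exact fixture) looks like this:

```python
# Sketch of the savepoint-restart testing recipe, using an in-memory SQLite
# engine for brevity (pysqlite has known SAVEPOINT quirks; a real suite may
# need the documented workarounds). Not the exact conftest.py fixture.
from sqlalchemy import create_engine, event
from sqlalchemy.orm import sessionmaker

engine = create_engine("sqlite://")
connection = engine.connect()
outer_transaction = connection.begin()
session = sessionmaker(bind=connection)()
session.begin_nested()  # the test body runs inside this SAVEPOINT


@event.listens_for(session, "after_transaction_end")
def restart_savepoint(session, transaction):
    # When the SAVEPOINT ends (e.g. the code under test commits), expire all
    # state the way a real top-level commit would, then open a new SAVEPOINT
    # so later statements stay isolated from the outer transaction.
    if transaction.nested and not transaction._parent.nested:
        session.expire_all()
        session.begin_nested()


# Teardown: everything the test did is discarded with the outer transaction.
session.close()
outer_transaction.rollback()
connection.close()
```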
219217
220218 @pytest.fixture
221219 def test_client(app):
222
223220 # flask.g is persisted in requests, and the werkzeug
224221 # CSRF checker could fail if we don't do this
225222 from flask import g
266263 # http://pythonhosted.org/Flask-Testing/#testing-with-sqlalchemy
267264 assert user.id is not None
268265 db.session.add(user)
269 sess['_user_id'] = user.id # TODO use public flask_login functions
266 sess['_user_id'] = user.fs_uniquifier # TODO use public flask_login functions
270267 identity_changed.send(test_client.application,
271268 identity=Identity(user.id))
272269
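The login helper now stores `user.fs_uniquifier` instead of `user.id` in the session because recent Flask-Security(-Too) releases make `User.get_id()` return `fs_uniquifier`, so the value kept under `_user_id` has to match it. A rough sketch of the mechanism the test shortcut mirrors (the class and loader below are illustrative, not Faraday's models):

```python
# Sketch only: how a flask_login-style loader resolves the session value once
# fs_uniquifier is the identity attribute. Names below are made up.
import uuid


class User:
    def __init__(self, pk):
        self.id = pk
        # unique, rotatable identifier; changing it invalidates old sessions
        self.fs_uniquifier = uuid.uuid4().hex

    def get_id(self):
        # this is what ends up in session['_user_id']
        return self.fs_uniquifier


def load_user(stored_value, users):
    # lookup happens by fs_uniquifier, not by primary key
    return next((u for u in users if u.fs_uniquifier == stored_value), None)


users = [User(1), User(2)]
assert load_user(users[0].get_id(), users) is users[0]
```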
298295 session_response = test_client.get('/session')
299296 return session_response.json.get('csrf_token')
300297
301
302298 # I'm Py3
1111 import datetime
1212 import itertools
1313 import unicodedata
14 import uuid
1415 import time
1516
1617 import pytz
17 from factory import SubFactory
1818 from factory.fuzzy import (
1919 BaseFuzzyAttribute,
2020 FuzzyChoice,
5151 Executor,
5252 Rule,
5353 Action,
54 RuleAction)
54 RuleAction,
55 Condition)
56
5557
5658 # Make partials for start and end date. End date must be after start date
5759 def FuzzyStartTime():
6264 )
6365 )
6466
67
6568 def FuzzyEndTime():
6669 return (
6770 FuzzyNaiveDateTime(
7073 )
7174 )
7275
76
7377 all_unicode = ''.join(chr(i) for i in range(65536))
7478 UNICODE_LETTERS = ''.join(c for c in all_unicode if unicodedata.category(c) == 'Lu' or unicodedata.category(c) == 'Ll')
7579
9094 class UserFactory(FaradayFactory):
9195
9296 username = FuzzyText()
97 fs_uniquifier = factory.LazyAttribute(
98 lambda e: uuid.uuid4().hex
99 )
93100
94101 class Meta:
95102 model = User
98105
99106 class WorkspaceFactory(FaradayFactory):
100107
101 name = FuzzyText(chars=string.ascii_lowercase+string.digits)
108 name = FuzzyText(chars=string.ascii_lowercase + string.digits)
102109 creator = factory.SubFactory(UserFactory)
103110
104111 class Meta:
123130
124131 def __init__(self, low, high, **kwargs):
125132 self.iterator = itertools.cycle(range(low, high - 1))
126 super(FuzzyIncrementalInteger, self).__init__(**kwargs)
133 super().__init__(**kwargs)
127134
128135 def fuzz(self):
129136 return next(self.iterator)
189196
190197 @classmethod
191198 def build_dict(cls, **kwargs):
192 ret = super(ServiceFactory, cls).build_dict(**kwargs)
199 ret = super().build_dict(**kwargs)
193200 ret['host'].workspace = kwargs['workspace']
194201 ret['parent'] = ret['host'].id
195202 ret['ports'] = [ret['port']]
326333 service = factory.SubFactory(ServiceFactory, workspace=factory.SelfAttribute('..workspace'))
327334 type = "vulnerability_web"
328335
329
330 @classmethod
331 def build_dict(cls, **kwargs):
332 ret = super(VulnerabilityWebFactory, cls).build_dict(**kwargs)
336 @classmethod
337 def build_dict(cls, **kwargs):
338 ret = super().build_dict(**kwargs)
333339 assert ret['type'] == 'vulnerability_web'
334340 ret['type'] = 'VulnerabilityWeb'
335341 return ret
360366 model = VulnerabilityTemplate
361367 sqlalchemy_session = db.session
362368
363
364 @classmethod
365 def build_dict(cls, **kwargs):
366 ret = super(VulnerabilityTemplateFactory, cls).build_dict(**kwargs)
369 @classmethod
370 def build_dict(cls, **kwargs):
371 ret = super().build_dict(**kwargs)
367372 ret['exploitation'] = ret['severity']
368373 return ret
369374
428433 @classmethod
429434 def build_dict(cls, **kwargs):
430435 # Ugly hack to JSON-serialize datetimes
431 ret = super(CommandFactory, cls).build_dict(**kwargs)
436 ret = super().build_dict(**kwargs)
432437 ret['itime'] = time.mktime(ret['start_date'].utctimetuple())
433438 ret['duration'] = (ret['end_date'] - ret['start_date']).seconds + ((ret['end_date'] - ret['start_date']).microseconds / 1000000.0)
434439 ret.pop('start_date')
465470 @classmethod
466471 def build_dict(cls, **kwargs):
467472 # The host, service or comment must be created
468 ret = super(CommentFactory, cls).build_dict(**kwargs)
473 ret = super().build_dict(**kwargs)
469474 workspace = kwargs['workspace']
470475 if ret['object_type'] == 'host':
471476 HostFactory.create(workspace=workspace, id=ret['object_id'])
482487 sqlalchemy_session = db.session
483488
484489
485
486490 class LicenseFactory(FaradayFactory):
487491 product = FuzzyText()
488492 start_date = FuzzyStartTime()
496500 @classmethod
497501 def build_dict(cls, **kwargs):
498502 # Ugly hack to JSON-serialize datetimes
499 ret = super(LicenseFactory, cls).build_dict(**kwargs)
503 ret = super().build_dict(**kwargs)
500504 ret['start'] = ret['start_date'].isoformat()
501505 ret['end'] = ret['end_date'].isoformat()
502506 ret.pop('start_date')
546550
547551 @classmethod
548552 def build_dict(cls, **kwargs):
549 return super(AgentFactory, cls).build_dict(**kwargs)
553 return super().build_dict(**kwargs)
550554
551555 class Meta:
552556 model = Agent
559563 parameters_metadata = factory.LazyAttribute(
560564 lambda e: {"param_name": False}
561565 )
566
562567 class Meta:
563568 model = Executor
564569 sqlalchemy_session = db.session
585590 sqlalchemy_session = db.session
586591
587592
588
589593 class SearchFilterFactory(FaradayFactory):
590594
591595 name = FuzzyText()
610614 sqlalchemy_session = db.session
611615
612616
617 class ConditionFactory(FaradayFactory):
618 field = 'description'
619 value = FuzzyText()
620 operator = 'equals'
621
622 class Meta:
623 model = Condition
624 sqlalchemy_session = db.session
625
626
613627 class RuleFactory(WorkspaceObjectFactory):
614628 model = 'Vulnerability'
615 object = "severity=low",
616629 disabled = FuzzyChoice([True, False])
617630 workspace = factory.SubFactory(WorkspaceFactory)
618631
1616 TaskTemplate,
1717 WorkspacePermission,
1818 )
19
1920
2021 def test_delete_user(workspace, session):
2122 assert workspace.creator
243244 def test_delete_user_deletes_assignations(self):
244245 with self.assert_deletes(self.methodology_task_assigned):
245246 self.session.delete(self.user)
246 # I'm Py3
1313 from tests.test_api_workspaced_base import (
1414 ReadOnlyAPITests)
1515 from tests import factories
16 from tests.factories import WorkspaceFactory
16
1717
1818 @pytest.mark.parametrize(
1919 "with_host_vulns,with_service_vulns", [[True, False],
126126 assert len(host.hostnames) == 1
127127 assert host.hostnames[0].name == 'y'
128128
129
129130 HOST_TO_QUERY_AMOUNT = 3
130131 HOST_NOT_TO_QUERY_AMOUNT = 2
131132 SERVICE_BY_HOST = 3
132133 VULN_BY_HOST = 2
133134 VULN_BY_SERVICE = 1
135
134136
135137 class TestHostAPI(ReadOnlyAPITests):
136138 model = Host
191193
191193 # This tests the API endpoint for some of the hosts in the workspace while other hosts exist in another workspace,
192194 # and also tests the endpoint for all hosts in the workspace, retrieving every host when none is explicitly requested
194 @pytest.mark.parametrize('querystring', [ 'countVulns/?hosts={}', 'countVulns/',
196 @pytest.mark.parametrize('querystring', ['countVulns/?hosts={}', 'countVulns/',
195197 ])
196198 def test_vuln_count_ignore_other_ws(self,
197199 vulnerability_factory,
241243
242244 for host in hosts_not_to_query_w2:
243245 assert str(host.id) not in res.json['hosts']
244 # I'm Py3
246 # I'm Py3
2828
2929 session.commit()
3030 assert vuln.tags == set(correct_tags)
31 # I'm Py3
120120 assert self.model.query.count() == 2
121121 assert len(self.childs(self.vuln_different_ws)) == 1
122122 new_child = self.childs(self.vuln_different_ws, True).pop()
123 assert (new_child.workspace_id ==
124 self.vuln_different_ws.workspace_id)
123 assert (new_child.workspace_id
124 == self.vuln_different_ws.workspace_id)
125125 assert new_child.id != child.id
126126
127127 def test_remove_reference(self, session, child):
8585 assert workspace['vulnerability_standard_count'] == sum(
8686 STANDARD_VULN_COUNT)
8787 assert workspace['vulnerability_total_count'] == (
88 sum(STANDARD_VULN_COUNT) + WEB_VULN_COUNT +
89 SOURCE_CODE_VULN_COUNT
88 sum(STANDARD_VULN_COUNT) + WEB_VULN_COUNT + SOURCE_CODE_VULN_COUNT
9089 )
9190
9291
102101 assert workspace['vulnerability_standard_count'] == sum(
103102 C_STANDARD_VULN_COUNT)
104103 assert workspace['vulnerability_total_count'] == (
105 sum(C_STANDARD_VULN_COUNT) + C_WEB_VULN_COUNT +
106 C_SOURCE_CODE_VULN_COUNT
104 sum(C_STANDARD_VULN_COUNT) + C_WEB_VULN_COUNT + C_SOURCE_CODE_VULN_COUNT
107105 )
108106
109107
118116 assert workspace.vulnerability_code_count is None
119117 assert workspace.vulnerability_standard_count is None
120118 assert workspace.vulnerability_total_count is None
121 # I'm Py3
33 See the file 'doc/LICENSE' for the license information
44
55 '''
6
7 # I'm Py3
3737 assert activities['hosts_count'] == 1
3838 assert activities['vulnerabilities_count'] == 1
3939 assert activities['tool'] == 'nessus'
40
4140
4241 def test_load_itime(self, test_client, session):
4342 ws = WorkspaceFactory.create(name="abc")
1212 from faraday.server.api.modules.agent import AgentWithWorkspacesView, AgentView
1313 from faraday.server.models import Agent, Command
1414 from tests.factories import AgentFactory, WorkspaceFactory, ExecutorFactory
15 from tests.test_api_non_workspaced_base import ReadWriteAPITests, OBJECT_COUNT, PatchableTestsMixin
16 from tests.test_api_workspaced_base import ReadWriteMultiWorkspacedAPITests, ReadOnlyMultiWorkspacedAPITests
15 from tests.test_api_non_workspaced_base import ReadWriteAPITests, PatchableTestsMixin
16 from tests.test_api_workspaced_base import ReadOnlyMultiWorkspacedAPITests
1717 from tests import factories
1818 from tests.test_api_workspaced_base import API_PREFIX
1919 from tests.utils.url import v2_to_v3
9797 session.commit()
9898 secret = pyotp.random_base32()
9999 faraday_server_config.agent_registration_secret = secret
100 faraday_server_config.agent_token_expiration = 60
100101 logout(test_client, [302])
101102 initial_agent_count = len(session.query(Agent).all())
102103 raw_data = get_raw_agent(
103104 name='new_agent',
104 token=pyotp.TOTP(secret).now(),
105 token=pyotp.TOTP(secret, interval=60).now(),
105106 workspaces=[workspace, other_workspace]
106107 )
107108 # /v2/agent_registration/
124125 session.commit()
125126 secret = pyotp.random_base32()
126127 faraday_server_config.agent_registration_secret = secret
128 faraday_server_config.agent_token_expiration = 60
127129 logout(test_client, [302])
128130 initial_agent_count = len(session.query(Agent).all())
129131 raw_data = get_raw_agent(
130132 name=None,
131 token=pyotp.TOTP(secret).now(),
133 token=pyotp.TOTP(secret, interval=60).now(),
132134 workspaces=[workspace]
133135 )
134136 # /v2/agent_registration/
191193 session.commit()
192194 secret = pyotp.random_base32()
193195 faraday_server_config.agent_registration_secret = secret
196 faraday_server_config.agent_token_expiration = 60
194197 logout(test_client, [302])
195198 raw_data = get_raw_agent(
196 token=pyotp.TOTP(secret).now(),
199 token=pyotp.TOTP(secret, interval=60).now(),
197200 name="test agent",
198201 workspaces=[]
199202 )
209212 session.commit()
210213 secret = pyotp.random_base32()
211214 faraday_server_config.agent_registration_secret = secret
215 faraday_server_config.agent_token_expiration = 60
212216 logout(test_client, [302])
213217 raw_data = get_raw_agent(
214 token=pyotp.TOTP(secret).now(),
218 token=pyotp.TOTP(secret, interval=60).now(),
215219 name="test agent",
216220 workspaces=[]
217221 )
228232 session.commit()
229233 secret = pyotp.random_base32()
230234 faraday_server_config.agent_registration_secret = secret
235 faraday_server_config.agent_token_expiration = 60
231236 logout(test_client, [302])
232237 raw_data = get_raw_agent(
233238 name="test agent",
234 token=pyotp.TOTP(secret).now()
239 token=pyotp.TOTP(secret, interval=60).now()
235240 )
236241 # /v2/agent_registration/
237242 res = test_client.post(self.check_url('/v2/agent_registration/'), data=raw_data)
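These registration tests now pin `agent_token_expiration` to 60 and mint the one-time token with `pyotp.TOTP(secret, interval=60)`; generator and verifier have to agree on the time step. A standalone sketch of that pairing (not Faraday's verification code):

```python
# Sketch: TOTP with a non-default 60-second time step. Generation and
# verification must use the same interval, otherwise the derived counters
# differ and the token is (almost always) rejected.
import pyotp

secret = pyotp.random_base32()

token = pyotp.TOTP(secret, interval=60).now()

assert pyotp.TOTP(secret, interval=60).verify(token)   # same interval: accepted
print(pyotp.TOTP(secret).verify(token))                # default 30s interval: usually False
```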
252257
253258 def test_create_succeeds(self, test_client):
254259 with pytest.raises(AssertionError) as exc_info:
255 super(TestAgentWithWorkspacesAPIGeneric, self).test_create_succeeds(test_client)
260 super().test_create_succeeds(test_client)
256261 assert '405' in exc_info.value.args[0]
257262
258263 def test_create_fails_with_empty_dict(self, test_client):
259264 with pytest.raises(AssertionError) as exc_info:
260 super(TestAgentWithWorkspacesAPIGeneric, self).test_create_fails_with_empty_dict(test_client)
265 super().test_create_fails_with_empty_dict(test_client)
261266 assert '405' in exc_info.value.args[0]
262267
263 def workspaced_url(self, workspace, obj= None):
268 def workspaced_url(self, workspace, obj=None):
264269 url = API_PREFIX + workspace.name + '/' + self.api_endpoint + '/'
265270 if obj is not None:
266271 id_ = str(obj.id) if isinstance(obj, self.model) else str(obj)
391396 assert res.status_code == 204
392397 assert len(session.query(Agent).all()) == initial_agent_count
393398
394 def test_run_fails(self, test_client, session,csrf_token):
399 def test_run_fails(self, test_client, session, csrf_token):
395400 workspace = WorkspaceFactory.create()
396401 session.add(workspace)
397402 other_workspace = WorkspaceFactory.create()
422427
423428 class TestAgentWithWorkspacesAPIGenericV3(TestAgentWithWorkspacesAPIGeneric, PatchableTestsMixin):
424429 def url(self, obj=None):
425 return v2_to_v3(super(TestAgentWithWorkspacesAPIGenericV3, self).url(obj))
430 return v2_to_v3(super().url(obj))
426431
427432
428433 class TestAgentAPI(ReadOnlyMultiWorkspacedAPITests):
602607
603608 class TestAgentAPIV3(TestAgentAPI):
604609 def url(self, obj=None, workspace=None):
605 return v2_to_v3(super(TestAgentAPIV3, self).url(obj, workspace))
610 return v2_to_v3(super().url(obj, workspace))
606611
607612 def check_url(self, url):
608613 return v2_to_v3(url)
0 '''
1 Faraday Penetration Test IDE
2 Copyright (C) 2013 Infobyte LLC (http://www.infobytesec.com/)
3 See the file 'doc/LICENSE' for the license information
4
5 '''
6 from builtins import str
7 import base64
8
9 import pytest
10 from tests import factories
11 from flask_security.utils import hash_password
12 from faraday.server.api.modules.websocket_auth import decode_agent_websocket_token
13 from tests.utils.url import v2_to_v3
14
15
16 class TestWebsocketAuthEndpoint:
17 def check_url(self, url):
18 return url
19
20 def test_not_logged_in_request_fail(self, test_client, workspace):
21 res = test_client.post(self.check_url(f'/v2/ws/{workspace.name}/websocket_token/'))
22 assert res.status_code == 401
23
24 @pytest.mark.usefixtures('logged_user')
25 def test_get_method_succeeds(self, test_client, workspace):
26 res = test_client.get(self.check_url(f'/v2/ws/{workspace.name}/websocket_token/'))
27 assert res.status_code == 200
28
29 # A token for that workspace should be generated,
30 # This will break if we change the token generation
31 # mechanism.
32 assert res.json['token'].startswith(str(workspace.id))
33
34 @pytest.mark.usefixtures('logged_user')
35 def test_post_method_succeeds(self, test_client, workspace):
36 res = test_client.post(self.check_url(f'/v2/ws/{workspace.name}/websocket_token/'))
37 assert res.status_code == 200
38
39 # A token for that workspace should be generated,
40 # This will break if we change the token generation
41 # mechanism.
42 assert res.json['token'].startswith(str(workspace.id))
43
44
45 class TestWebsocketAuthEndpointV3(TestWebsocketAuthEndpoint):
46 def check_url(self, url):
47 return v2_to_v3(url)
48
49
50 class TestAgentWebsocketToken:
51
52 def check_url(self, url):
53 return url
54
55 @pytest.mark.usefixtures('session') # I don't know why this is required
56 def test_fails_without_authorization_header(self, test_client):
57 res = test_client.post(
58 self.check_url('/v2/agent_websocket_token/')
59 )
60 assert res.status_code == 401
61
62 @pytest.mark.usefixtures('logged_user')
63 def test_fails_with_logged_user(self, test_client):
64 res = test_client.post(
65 self.check_url('/v2/agent_websocket_token/')
66 )
67 assert res.status_code == 401
68
69 @pytest.mark.usefixtures('logged_user')
70 def test_fails_with_user_token(self, test_client, session):
71 res = test_client.get(self.check_url('/v2/token/'))
72
73 assert res.status_code == 200
74
75 headers = [('Authorization', 'Token ' + res.json)]
76
77 # clean cookies make sure test_client has no session
78 test_client.cookie_jar.clear()
79 res = test_client.post(
80 self.check_url('/v2/agent_websocket_token/'),
81 headers=headers,
82 )
83 assert res.status_code == 401
84
85 @pytest.mark.usefixtures('session')
86 def test_fails_with_invalid_agent_token(self, test_client):
87 headers = [('Authorization', 'Agent 13123')]
88 res = test_client.post(
89 self.check_url('/v2/agent_websocket_token/'),
90 headers=headers,
91 )
92 assert res.status_code == 403
93
94 @pytest.mark.usefixtures('session')
95 def test_succeeds_with_agent_token(self, test_client, agent, session):
96 session.add(agent)
97 session.commit()
98 assert agent.token
99 headers = [('Authorization', 'Agent ' + agent.token)]
100 res = test_client.post(
101 self.check_url('/v2/agent_websocket_token/'),
102 headers=headers,
103 )
104 assert res.status_code == 200
105 decoded_agent = decode_agent_websocket_token(res.json['token'])
106 assert decoded_agent == agent
107
108
109 class TestBasicAuth:
110
111 def check_url(self, url):
112 return url
113
114 def test_basic_auth_invalid_credentials(self, test_client, session):
115 """
116 Use of invalid Basic Auth credentials
117 """
118
119 alice = factories.UserFactory.create(
120 active=True,
121 username='asdasd',
122 password=hash_password('asdasd'),
123 role='admin')
124 session.add(alice)
125 session.commit()
126
127 agent = factories.AgentFactory.create()
128 session.add(agent)
129 session.commit()
130
131 valid_credentials = base64.b64encode(b"asdasd:wrong_password").decode("utf-8")
132 headers = [('Authorization', f'Basic {valid_credentials}')]
133 res = test_client.get(self.check_url('/v2/agents/'), headers=headers)
134 assert res.status_code == 401
135
136 def test_basic_auth_valid_credentials(self, test_client, session):
137 """
138 Use of valid Basic Auth credentials
139 """
140
141 alice = factories.UserFactory.create(
142 active=True,
143 username='asdasd',
144 password=hash_password('asdasd'),
145 role='admin')
146 session.add(alice)
147 session.commit()
148
149 agent = factories.AgentFactory.create()
150 session.add(agent)
151 session.commit()
152
153 valid_credentials = base64.b64encode(b"asdasd:asdasd").decode("utf-8")
154 headers = [('Authorization', f'Basic {valid_credentials}')]
155 res = test_client.get(self.check_url('/v2/agents/'), headers=headers)
156 assert res.status_code == 200
157
158
159 class TestAgentWebsocketTokenV3(TestAgentWebsocketToken):
160 def check_url(self, url):
161 return v2_to_v3(url)
162
163
164 class TestBasicAuthV3(TestBasicAuth):
165 def check_url(self, url):
166 return v2_to_v3(url)
00 from datetime import datetime, timedelta, timezone
1 import string
21
32 import pytest
43 from marshmallow import ValidationError
4 from sqlalchemy import true, null, false
5
56 from faraday.server.models import (
67 db,
78 Command,
5455 'status_code': 200,
5556 }
5657
57
5858 credential_data = {
5959 'name': 'test credential',
6060 'description': 'test',
6161 'username': 'admin',
6262 'password': '12345',
6363 }
64
6564
6665 command_data = {
6766 'tool': 'pytest',
115114 assert host.ip == "127.0.0.1"
116115 assert set({hn.name for hn in host.hostnames}) == {"test.com", "test2.org", "test3.org"}
117116
117
118118 def test_create_existing_host(session, host):
119119 session.add(host)
120120 session.commit()
159159 data = bc.BulkServiceSchema().load(data)
160160 bc._create_service(service.workspace, service.host, data)
161161 assert count(Service, service.host.workspace) == 1
162
162163
163164 def test_create_host_vuln(session, host):
164165 data = bc.VulnerabilitySchema().load(vuln_data)
215216 dict(
216217 command=command_data,
217218 hosts=[host_data_]
218 )
219 )
219220 )
220221 assert count(Vulnerability, service.workspace) == 1
221222 vuln = service.workspace.vulnerabilities[0]
238239 assert count(Vulnerability, service.workspace) == 1
239240 vuln = service.workspace.vulnerabilities[0]
240241 assert vuln.tool == command_data['tool']
242
241243
242244 def test_cannot_create_host_vulnweb(session, host):
243245 data = vuln_data.copy()
427429 service = host.services[0]
428430 vuln_host = Vulnerability.query.filter(
429431 Vulnerability.workspace == workspace,
430 Vulnerability.service == None).one()
432 Vulnerability.service == null()).one()
431433 vuln_service = Vulnerability.query.filter(
432434 Vulnerability.workspace == workspace,
433 Vulnerability.host == None).one()
435 Vulnerability.host == null()).one()
434436 vuln_web = VulnerabilityWeb.query.filter(
435437 VulnerabilityWeb.workspace == workspace).one()
436438 host_cred = Credential.query.filter(
457459 CommandObject.command == command,
458460 CommandObject.object_type == table_name,
459461 CommandObject.object_id == obj.id,
460 CommandObject.created_persistent == True,
462 CommandObject.created_persistent == true(),
461463 ).one()
462464
463465
566568 CommandObject.command == new_command,
567569 CommandObject.object_type == table_name,
568570 CommandObject.object_id == obj.id,
569 CommandObject.created_persistent == False,
571 CommandObject.created_persistent == false(),
570572 ).one()
571573
572574
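Swapping `== None` / `== True` / `== False` for `null()` / `true()` / `false()` keeps the queries semantically identical (IS NULL, `= true`, `= false`) while avoiding flake8's E711/E712 warnings about comparisons to literals. A tiny standalone illustration (the columns here are ad hoc, not Faraday models):

```python
# Sketch: the SQL rendered by the constructs used in the test changes above.
from sqlalchemy import column, false, null, true

owner_id = column("owner_id")
active = column("active")

print(owner_id == null())   # owner_id IS NULL   (equivalent to "== None", no E711)
print(active == true())     # active = true      (equivalent to "== True", no E712)
print(active == false())    # active = false     (equivalent to "== False", no E712)
```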
829831 )
830832 assert res.status_code == 400
831833
832 assert Host.query.filter(Host.workspace == workspace and Host.creator_id is None).count() == initial_host_count
834 assert Host.query.filter(
835 Host.workspace == workspace and Host.creator_id is None).count() == initial_host_count
833836 assert count(Command, workspace) == 1
834837 data_kwargs["execution_id"] = extra_agent_execution.id
835838 res = test_client.post(
838841 headers=[("authorization", f"agent {agent.token}")]
839842 )
840843 assert res.status_code == 400
841 assert Host.query.filter(Host.workspace == workspace and Host.creator_id is None).count() == initial_host_count
844 assert Host.query.filter(
845 Host.workspace == workspace and Host.creator_id is None).count() == initial_host_count
842846 assert count(Command, workspace) == 1
843847 data_kwargs["execution_id"] = agent_execution.id
844848 res = test_client.post(
907911 session.add(workspace)
908912 session.commit()
909913 for workspace in agent.workspaces:
910
911914 url = self.check_url(f'/v2/ws/{workspace.name}/bulk_create/')
912915 res = test_client.post(
913916 url,
10001003 host_data_['services'] = [service_data]
10011004 host_data_['credentials'] = [credential_data]
10021005 host_data_['vulnerabilities'] = [vuln_data]
1003 host_data_['default_gateway'] = ["localhost"] # Can not be a list
1006 host_data_['default_gateway'] = ["localhost"] # Can not be a list
10041007 res = test_client.post(url, data=dict(hosts=[host_data_]))
10051008 assert res.status_code == 400, res.json
10061009 assert count(Host, workspace) == 0
0 #-*- coding: utf8 -*-
0 # -*- coding: utf8 -*-
11 '''
22 Faraday Penetration Test IDE
33 Copyright (C) 2013 Infobyte LLC (http://www.infobytesec.com/)
1313 import time
1414
1515 from tests import factories
16 from tests.test_api_workspaced_base import API_PREFIX, ReadWriteAPITests, PatchableTestsMixin
16 from tests.test_api_workspaced_base import ReadWriteAPITests, PatchableTestsMixin
1717 from faraday.server.models import (
1818 Command,
19 Workspace,
2019 Vulnerability)
2120 from faraday.server.api.modules.commandsrun import CommandView, CommandV3View
22 from faraday.server.api.modules.workspaces import WorkspaceView
2321 from tests.factories import VulnerabilityFactory, EmptyCommandFactory, CommandObjectFactory, HostFactory, \
2422 WorkspaceFactory, ServiceFactory
2523
4644 def test_list_retrieves_all_items_from_workspace(self, test_client,
4745 second_workspace,
4846 session):
49 super(TestListCommandView, self).test_list_retrieves_all_items_from_workspace(test_client, second_workspace, session)
47 super().test_list_retrieves_all_items_from_workspace(test_client, second_workspace, session)
5048
5149 @pytest.mark.usefixtures('ignore_nplusone')
5250 def test_backwards_compatibility_list(self, test_client, second_workspace, session):
7775
7876 @pytest.mark.usefixtures('ignore_nplusone')
7977 def test_can_list_readonly(self, test_client, session):
80 super(TestListCommandView, self).test_can_list_readonly(test_client, session)
78 super().test_can_list_readonly(test_client, session)
8179
8280 def test_activity_feed(self, session, test_client):
8381 command = self.factory.create()
115113 u'criticalIssue': 0}]
116114
117115 assert list(filter(lambda stats: stats['_id'] == another_command.id,
118 res.json)) == [{
119 u'_id': another_command.id,
120 u'command': another_command.command,
121 u'import_source': u'shell',
122 u'tool': another_command.tool,
123 u'user': another_command.user,
124 u'date': time.mktime(
125 another_command.start_date.timetuple()) * 1000,
126 u'params': another_command.params,
127 u'hosts_count': 0,
128 u'services_count': 0,
129 u'vulnerabilities_count': 0,
130 u'criticalIssue': 0}]
116 res.json)) == [{
117 u'_id': another_command.id,
118 u'command': another_command.command,
119 u'import_source': u'shell',
120 u'tool': another_command.tool,
121 u'user': another_command.user,
122 u'date': time.mktime(
123 another_command.start_date.timetuple()) * 1000,
124 u'params': another_command.params,
125 u'hosts_count': 0,
126 u'services_count': 0,
127 u'vulnerabilities_count': 0,
128 u'criticalIssue': 0}]
131129
132130 def test_verify_created_critical_vulns_is_correctly_showing_sum_values(self, session, test_client):
133131 workspace = WorkspaceFactory.create()
158156 res = test_client.get(self.check_url(urljoin(self.url(workspace=command.workspace), 'activity_feed/')))
159157 assert res.status_code == 200
160158 assert res.json == [
161 {u'_id': command.id,
162 u'command': command.command,
163 u'import_source': u'shell',
164 u'tool': command.tool,
165 u'user': command.user,
166 u'date': time.mktime(command.start_date.timetuple()) * 1000,
167 u'params': command.params,
168 u'hosts_count': 1,
169 u'services_count': 0,
170 u'vulnerabilities_count': 2,
171 u'criticalIssue': 1}
172 ]
159 {u'_id': command.id,
160 u'command': command.command,
161 u'import_source': u'shell',
162 u'tool': command.tool,
163 u'user': command.user,
164 u'date': time.mktime(command.start_date.timetuple()) * 1000,
165 u'params': command.params,
166 u'hosts_count': 1,
167 u'services_count': 0,
168 u'vulnerabilities_count': 2,
169 u'criticalIssue': 1}
170 ]
173171
174172 def test_verify_created_vulns_with_host_and_service_verification(self, session, test_client):
175173 workspace = WorkspaceFactory.create()
291289 for in_the_middle_command in in_the_middle_commands:
292290 raw_in_the_middle_command = list(filter(lambda comm: comm['_id'] == in_the_middle_command.id, res.json))
293291 assert raw_in_the_middle_command.pop() == {u'_id': in_the_middle_command.id,
294 u'command': in_the_middle_command.command,
295 u'import_source': u'shell',
296 u'user': in_the_middle_command.user,
297 u'date': time.mktime(in_the_middle_command.start_date.timetuple()) * 1000,
298 u'params': in_the_middle_command.params,
299 u'hosts_count': 0,
300 u'tool': in_the_middle_command.tool,
301 u'services_count': 0,
302 u'vulnerabilities_count': 0,
303 u'criticalIssue': 0}
292 u'command': in_the_middle_command.command,
293 u'import_source': u'shell',
294 u'user': in_the_middle_command.user,
295 u'date': time.mktime(
296 in_the_middle_command.start_date.timetuple()) * 1000,
297 u'params': in_the_middle_command.params,
298 u'hosts_count': 0,
299 u'tool': in_the_middle_command.tool,
300 u'services_count': 0,
301 u'vulnerabilities_count': 0,
302 u'criticalIssue': 0}
304303
305304 # new command must create new service and vuln
306305 raw_last_command = list(filter(lambda comm: comm['_id'] == last_command.id, res.json))
307306 assert raw_last_command.pop() == {u'_id': last_command.id,
308 u'command': last_command.command,
309 u'import_source': u'shell',
310 u'user': last_command.user,
311 u'date': time.mktime(last_command.start_date.timetuple()) * 1000,
312 u'params': last_command.params,
313 u'hosts_count': 0,
314 u'tool': last_command.tool,
315 u'services_count': 1,
316 u'vulnerabilities_count': 1,
317 u'criticalIssue': 0}
307 u'command': last_command.command,
308 u'import_source': u'shell',
309 u'user': last_command.user,
310 u'date': time.mktime(last_command.start_date.timetuple()) * 1000,
311 u'params': last_command.params,
312 u'hosts_count': 0,
313 u'tool': last_command.tool,
314 u'services_count': 1,
315 u'vulnerabilities_count': 1,
316 u'criticalIssue': 0}
318317
319318 @pytest.mark.usefixtures('ignore_nplusone')
320319 def test_sub_second_command_returns_correct_duration_value(self, test_client):
367366 assert res.json['commands'][0]['value']['duration'].lower() == "in progress"
368367
369368 def test_create_command(self, test_client):
370 raw_data ={
369 raw_data = {
371370 'command': 'Import Nessus:',
372371 'tool': 'nessus',
373372 'duration': None,
432431 assert command_history['tool'] == 'test'
433432
434433 def test_year_is_out_range(self, test_client):
435 raw_data ={
434 raw_data = {
436435 'command': 'Import Nessus:',
437436 'tool': 'nessus',
438437 'duration': None,
452451 view_class = CommandV3View
453452
454453 def url(self, obj=None, workspace=None):
455 return v2_to_v3(super(TestListCommandViewV3, self).url(obj, workspace))
454 return v2_to_v3(super().url(obj, workspace))
456455
457456 def check_url(self, url):
458457 return v2_to_v3(url)
7676 assert res.status_code == 400
7777 assert res.json == {u'message': u"Can't comment inexistent object"}
7878
79
8079 def test_create_unique_comment_for_plugins(self, session, test_client):
8180 """
8281
124123 factories.CommentFactory.create(workspace=workspace, text='third')
125124 factories.CommentFactory.create(workspace=workspace, text='fourth')
126125 get_comments = test_client.get(self.url(workspace=workspace))
127 expected = ['first', 'second', 'third','fourth']
126 expected = ['first', 'second', 'third', 'fourth']
128127 assert expected == [comment['text'] for comment in get_comments.json]
129128
130129
132131 view_class = CommentV3View
133132
134133 def url(self, obj=None, workspace=None):
135 return v2_to_v3(super(TestCommentAPIGenericV3, self).url(obj, workspace))
134 return v2_to_v3(super().url(obj, workspace))
136135
137136 def check_url(self, url):
138137 return v2_to_v3(url)
5656 host = host_factory.create(workspace=workspace)
5757 session.commit()
5858 raw_data = {
59 "_id":"1.e5069bb0718aa519852e6449448eedd717f1b90d",
60 "name":"name",
61 "username":"username",
62 "metadata":{"update_time":1508794240799,"update_user":"",
63 "update_action":0,"creator":"UI Web",
64 "create_time":1508794240799,"update_controller_action":"",
65 "owner":""},
66 "password":"pass",
67 "type":"Cred",
68 "owner":"",
69 "description":"",
59 "_id": "1.e5069bb0718aa519852e6449448eedd717f1b90d",
60 "name": "name",
61 "username": "username",
62 "metadata": {"update_time": 1508794240799, "update_user": "",
63 "update_action": 0, "creator": "UI Web",
64 "create_time": 1508794240799, "update_controller_action": "",
65 "owner": ""},
66 "password": "pass",
67 "type": "Cred",
68 "owner": "",
69 "description": "",
7070 "parent": host.id,
7171 "parent_type": "Host"
7272 }
8181 service = service_factory.create(workspace=workspace)
8282 session.commit()
8383 raw_data = {
84 "_id":"1.e5069bb0718aa519852e6449448eedd717f1b90d",
85 "name":"name",
86 "username":"username",
87 "metadata":{"update_time":1508794240799,"update_user":"",
88 "update_action":0,"creator":"UI Web",
89 "create_time":1508794240799,"update_controller_action":"",
90 "owner":""},
91 "password":"pass",
92 "type":"Cred",
93 "owner":"",
94 "description":"",
84 "_id": "1.e5069bb0718aa519852e6449448eedd717f1b90d",
85 "name": "name",
86 "username": "username",
87 "metadata": {"update_time": 1508794240799, "update_user": "",
88 "update_action": 0, "creator": "UI Web",
89 "create_time": 1508794240799, "update_controller_action": "",
90 "owner": ""},
91 "password": "pass",
92 "type": "Cred",
93 "owner": "",
94 "description": "",
9595 "parent": service.id,
9696 "parent_type": "Service"
9797 }
152152 service = service_factory.create(workspace=workspace)
153153 session.commit()
154154 raw_data = {
155 "_id":"1.e5069bb0718aa519852e6449448eedd717f1b90d",
156 "name":"name",
157 "username":"username",
158 "metadata":{"update_time":1508794240799,"update_user":"",
159 "update_action":0,"creator":"UI Web",
160 "create_time":1508794240799,"update_controller_action":"",
161 "owner":""},
162 "password":"pass",
163 "type":"Cred",
164 "owner":"",
165 "description":"",
155 "_id": "1.e5069bb0718aa519852e6449448eedd717f1b90d",
156 "name": "name",
157 "username": "username",
158 "metadata": {"update_time": 1508794240799, "update_user": "",
159 "update_action": 0, "creator": "UI Web",
160 "create_time": 1508794240799, "update_controller_action": "",
161 "owner": ""},
162 "password": "pass",
163 "type": "Cred",
164 "owner": "",
165 "description": "",
166166 "parent": service.id,
167167 "parent_type": "Vulnerability"
168168 }
169169 res = test_client.post(self.url(), data=raw_data)
170170 assert res.status_code == 400
171171 assert res.json['messages']['json']['_schema'] == ['Unknown parent type: Vulnerability']
172
173172
174173 def test_update_credentials(self, test_client, session, host):
175174 credential = self.factory.create(host=host, service=None,
237236 assert res.status_code == 400
238237 assert b'Parent id not found' in res.data
239238
240
241239 def test_sort_credentials_target(self, test_client, second_workspace):
242240 host = HostFactory(workspace=second_workspace, ip="192.168.1.1")
243241 service = ServiceFactory(name="http", workspace=second_workspace, host=host)
260258 # Desc order
261259 response = test_client.get(self.url(workspace=second_workspace) + "?sort=target&sort_dir=desc")
262260 assert response.status_code == 200
263 assert sorted(credentials_target, reverse=True) == [ v['value']['target'] for v in response.json['rows']]
261 assert sorted(credentials_target, reverse=True) == [v['value']['target'] for v in response.json['rows']]
264262
265263 # Asc order
266264 response = test_client.get(self.url(workspace=second_workspace) + "?sort=target&sort_dir=asc")
272270 view_class = CredentialV3View
273271
274272 def url(self, obj=None, workspace=None):
275 return v2_to_v3(super(TestCredentialsAPIGenericV3, self).url(obj, workspace))
273 return v2_to_v3(super().url(obj, workspace))
0
10 import pytest
21
32 from tests.factories import CustomFieldsSchemaFactory
1514 model = CustomFieldsSchema
1615 factory = CustomFieldsSchemaFactory
1716 api_endpoint = 'custom_fields_schema'
18 #unique_fields = ['ip']
19 #update_fields = ['ip', 'description', 'os']
17 # unique_fields = ['ip']
18 # update_fields = ['ip', 'description', 'os']
2019 view_class = CustomFieldsSchemaView
2120 patchable_fields = ['field_name']
2221
3332
3433 res = test_client.get(self.url())
3534 assert res.status_code == 200
36 assert {u'table_name': u'vulnerability', u'id': add_text_field.id, u'field_type': u'text', u'field_name': u'cvss', u'field_display_name': u'CVSS', u'field_metadata': None, u'field_order': 1} in res.json
35 assert {u'table_name': u'vulnerability', u'id': add_text_field.id, u'field_type': u'text',
36 u'field_name': u'cvss', u'field_display_name': u'CVSS', u'field_metadata': None,
37 u'field_order': 1} in res.json
3738
3839 def test_custom_fields_field_name_cant_be_changed(self, session, test_client):
3940 add_text_field = CustomFieldsSchemaFactory.create(
8485
8586 class TestVulnerabilityCustomFieldsV3(TestVulnerabilityCustomFields, PatchableTestsMixin):
8687 def url(self, obj=None):
87 return v2_to_v3(super(TestVulnerabilityCustomFieldsV3, self).url(obj))
88 return v2_to_v3(super().url(obj))
11
22 import yaml
33 from apispec import APISpec
4 from faraday.server.web import app
4 from faraday.server.web import get_app
55 from apispec.ext.marshmallow import MarshmallowPlugin
66 from apispec_webframeworks.flask import FlaskPlugin
77 from faraday.utils.faraday_openapi_plugin import FaradayAPIPlugin
2929 exc = {'/login', '/logout', '/change', '/reset', '/reset/{token}', '/verify'}
3030 failing = []
3131
32 with app.test_request_context():
33 for endpoint in app.view_functions:
34 spec.path(view=app.view_functions[endpoint], app=app)
32 with get_app().test_request_context():
33 for endpoint in get_app().view_functions:
34 spec.path(view=get_app().view_functions[endpoint], app=get_app())
3535
3636 spec_yaml = yaml.load(spec.to_yaml(), Loader=yaml.BaseLoader)
3737
5454
5555 failing = []
5656
57 with app.test_request_context():
58 for endpoint in app.view_functions:
59 spec.path(view=app.view_functions[endpoint], app=app)
57 with get_app().test_request_context():
58 for endpoint in get_app().view_functions:
59 spec.path(view=get_app().view_functions[endpoint], app=get_app())
6060
6161 spec_yaml = yaml.load(spec.to_yaml(), Loader=yaml.BaseLoader)
6262
8080
8181 tags = set()
8282
83 with app.test_request_context():
84 for endpoint in app.view_functions:
85 spec.path(view=app.view_functions[endpoint], app=app)
83 with get_app().test_request_context():
84 for endpoint in get_app().view_functions:
85 spec.path(view=get_app().view_functions[endpoint], app=get_app())
8686
8787 spec_yaml = yaml.load(spec.to_yaml(), Loader=yaml.BaseLoader)
8888
55 '''
66
77 import pytest
8 from lxml.etree import fromstring, tostring
8 from lxml.etree import fromstring
99
1010 from tests.conftest import TEST_DATA_PATH
1111 from tests.factories import (
00 import re
1 from faraday.server.web import app
1 from faraday.server.web import get_app
22
33 placeholders = {
44 r".*(<int:.*>).*": "1"
1515
1616
1717 def test_options(test_client):
18 for rule in app.url_map.iter_rules():
18 for rule in get_app().url_map.iter_rules():
1919 if 'OPTIONS' in rule.methods:
2020 res = test_client.options(replace_placeholders(rule.rule))
2121 assert res.status_code == 200, rule.rule
2323
2424 def test_v3_endpoints():
2525 rules = list(
26 filter(lambda rule: rule.rule.startswith("/v3") and rule.rule.endswith("/"), app.url_map.iter_rules())
26 filter(lambda rule: rule.rule.startswith("/v3") and rule.rule.endswith("/"), get_app().url_map.iter_rules())
2727 )
2828 assert len(rules) == 0, [rule.rule for rule in rules]
2929
4040 rules_v2 = set(
4141 map(
4242 lambda rule: rule.rule.replace("v2", "v3").rstrip("/"),
43 filter(lambda rule: rule.rule.startswith("/v2"), app.url_map.iter_rules())
43 filter(lambda rule: rule.rule.startswith("/v2"), get_app().url_map.iter_rules())
4444 )
4545 )
4646 rules = set(
47 map(lambda rule: rule.rule, filter(lambda rule: rule.rule.startswith("/v3"), app.url_map.iter_rules()))
47 map(lambda rule: rule.rule, filter(lambda rule: rule.rule.startswith("/v3"), get_app().url_map.iter_rules()))
4848 )
4949 exceptions_present_v2 = rules_v2.intersection(exceptions)
5050 assert len(exceptions_present_v2) == len(exceptions), sorted(exceptions_present_v2)
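The tests now call `get_app()` instead of importing a module-level `app`, which is the usual Flask application-factory accessor: build the app lazily on first use and hand back the same instance afterwards. A rough sketch of that shape (this is an assumption about the pattern, not Faraday's actual `faraday.server.web` code):

```python
# Sketch of a lazy, cached application accessor around a create_app() factory.
from flask import Flask

_app = None


def create_app():
    app = Flask(__name__)
    # blueprints, extensions and config would be registered here
    return app


def get_app():
    """Build the Flask app on first call and reuse it afterwards."""
    global _app
    if _app is None:
        _app = create_app()
    return _app


# Callers, like the tests above, always go through the accessor:
with get_app().test_request_context():
    print(len(list(get_app().url_map.iter_rules())))
```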
55 '''
66
77 import pytest
8
89
910 @pytest.mark.skip(reason='occasional timeouts')
1011 @pytest.mark.usefixtures('logged_user')
1414 from urllib.parse import urlencode
1515 from random import choice
1616 from sqlalchemy.orm.util import was_deleted
17 from hypothesis import given, assume, settings, strategies as st
17 from hypothesis import given, strategies as st
1818
1919 import pytest
2020
2626 )
2727 from faraday.server.models import db, Host, Hostname
2828 from faraday.server.api.modules.hosts import HostsView, HostsV3View
29 from tests.factories import HostFactory, CommandFactory, \
30 EmptyCommandFactory, WorkspaceFactory
29 from tests.factories import HostFactory, EmptyCommandFactory, WorkspaceFactory
3130
3231 HOSTS_COUNT = 5
3332 SERVICE_COUNT = [10, 5] # 10 services to the first host, 5 to the second
33
3434
3535 @pytest.mark.usefixtures('database', 'logged_user')
3636 class TestHostAPI:
126126 res = test_client.post(self.url(), data={
127127 "ip": "127.0.0.1",
128128 "description": "aaaaa",
129 "_rev":"saraza"
129 "_rev": "saraza"
130130 # os is not required
131131 })
132132 assert res.status_code == 201
290290
291291 @pytest.mark.usefixtures('ignore_nplusone')
292292 def test_filter_restless_by_os_exact(self, test_client, session, workspace,
293 second_workspace, host_factory):
293 second_workspace, host_factory):
294294 # The hosts that should be shown
295295 hosts = host_factory.create_batch(10, workspace=workspace, os='Unix')
296296
307307
308308 @pytest.mark.usefixtures('ignore_nplusone')
309309 def test_filter_restless_count(self, test_client, session, workspace,
310 second_workspace, host_factory):
310 second_workspace, host_factory):
311311 # The hosts that should be shown
312312 hosts = host_factory.create_batch(30, workspace=workspace, os='Unix')
313313
326326 host_factory.create_batch(1, workspace=workspace, os='unix')
327327 session.commit()
328328 res = test_client.get(urljoin(self.url(), 'filter?q={"filters":[{"name": "os", "op": "like", "val": "%nix"}], '
329 '"group_by":[{"field": "os"}], "order_by":[{"field": "os", "direction": "desc"}]}'))
329 '"group_by":[{"field": "os"}], "order_by":[{"field": "os", "direction": "desc"}]}'))
330330 assert res.status_code == 200
331331 assert len(res.json['rows']) == 2
332332 assert res.json['count'] == 2
360360
361361 @pytest.mark.usefixtures('ignore_nplusone')
362362 def test_filter_restless_by_os_like_ilike(self, test_client, session, workspace,
363 second_workspace, host_factory):
363 second_workspace, host_factory):
364364 # The hosts that should be shown
365365 hosts = host_factory.create_batch(5, workspace=workspace, os='Unix 1')
366366 hosts += host_factory.create_batch(5, workspace=workspace, os='Unix 2')
379379 res = test_client.get(urljoin(
380380 self.url(),
381381 'filter?q={"filters":[{"name": "os", "op":"like", "val":"Unix %"}]}'
382 )
382 )
383383 )
384384 assert res.status_code == 200
385385 self.compare_results(hosts, res)
387387 res = test_client.get(urljoin(
388388 self.url(),
389389 'filter?q={"filters":[{"name": "os", "op":"ilike", "val":"Unix %"}]}'
390 )
390 )
391391 )
392392 assert res.status_code == 200
393393 self.compare_results(hosts + [case_insensitive_host], res)
410410
411411 @pytest.mark.usefixtures('ignore_nplusone')
412412 def test_filter_restless_by_service_name(self, test_client, session, workspace,
413 service_factory, host_factory):
413 service_factory, host_factory):
414414 services = service_factory.create_batch(10, workspace=workspace,
415415 name="IRC")
416416 hosts = [service.host for service in services]
431431 expected_host_ids = set(host.id for host in hosts)
432432 assert shown_hosts_ids == expected_host_ids
433433
434
435434 def test_filter_by_service_port(self, test_client, session, workspace,
436 service_factory, host_factory):
435 service_factory, host_factory):
437436 services = service_factory.create_batch(10, workspace=workspace, port=25)
438437 hosts = [service.host for service in services]
439438
447446 expected_host_ids = set(host.id for host in hosts)
448447 assert shown_hosts_ids == expected_host_ids
449448
450
451449 @pytest.mark.usefixtures('ignore_nplusone')
452450 def test_filter_restless_by_service_port(self, test_client, session, workspace,
453 service_factory, host_factory):
451 service_factory, host_factory):
454452 services = service_factory.create_batch(10, workspace=workspace, port=25)
455453 hosts = [service.host for service in services]
456454
493491
494492 assert res.status_code == 200
495493
496 severities = res.json['rows'][0]['value']['severity_counts']
494 severities = res.json['rows'][0]['value']['severity_counts']
497495 assert severities['info'] == 1
498496 assert severities['critical'] == 2
499497 assert severities['high'] == 1
503501 assert severities['total'] == 5
504502
505503 def test_filter_by_invalid_service_port(self, test_client, session, workspace,
506 service_factory, host_factory):
504 service_factory, host_factory):
507505 services = service_factory.create_batch(10, workspace=workspace, port=25)
508506 hosts = [service.host for service in services]
509507
516514 assert res.json['count'] == 0
517515
518516 def test_filter_restless_by_invalid_service_port(self, test_client, session, workspace,
519 service_factory, host_factory):
517 service_factory, host_factory):
520518 services = service_factory.create_batch(10, workspace=workspace, port=25)
521519 hosts = [service.host for service in services]
522520
543541
544542 @pytest.mark.usefixtures('ignore_nplusone')
545543 def test_filter_restless_with_no_q_param(self, test_client, session, workspace, host_factory):
546 res = test_client.get(urljoin(self.url(),'filter'))
544 res = test_client.get(urljoin(self.url(), 'filter'))
547545 assert res.status_code == 200
548546 assert len(res.json['rows']) == HOSTS_COUNT
549547
619617 vulnerability_factory.create(service=service, host=None, workspace=workspace)
620618 session.commit()
621619
622 res = test_client.get(self.check_url(urljoin(self.url(host),'services/')))
620 res = test_client.get(self.check_url(urljoin(self.url(host), 'services/')))
623621 assert res.status_code == 200
624622 assert res.json[0]['vulns'] == 1
625623
674672 session.commit()
675673 raw_data = {
676674 "metadata":
677 {
678 "update_time":1510688312.431,
679 "update_user":"UI Web",
680 "update_action":0,
681 "creator":"",
682 "create_time":1510673388000,
683 "update_controller_action":"",
684 "owner":"leonardo",
685 "command_id": None},
686 "name":"10.31.112.21",
687 "ip":"10.31.112.21",
688 "_rev":"",
689 "description":"",
675 {
676 "update_time": 1510688312.431,
677 "update_user": "UI Web",
678 "update_action": 0,
679 "creator": "",
680 "create_time": 1510673388000,
681 "update_controller_action": "",
682 "owner": "leonardo",
683 "command_id": None},
684 "name": "10.31.112.21",
685 "ip": "10.31.112.21",
686 "_rev": "",
687 "description": "",
690688 "default_gateway": None,
691689 "owned": False,
692 "services":12,
693 "hostnames":[],
694 "vulns":43,
695 "owner":"leonardo",
696 "credentials":0,
690 "services": 12,
691 "hostnames": [],
692 "vulns": 43,
693 "owner": "leonardo",
694 "credentials": 0,
697695 "_id": 4000,
698 "os":"Microsoft Windows Server 2008 R2 Standard Service Pack 1",
696 "os": "Microsoft Windows Server 2008 R2 Standard Service Pack 1",
699697 "id": 4000,
700 "icon":"windows",
698 "icon": "windows",
701699 "versions": [],
702700 "important": False,
703701 }
835833
836834 class TestHostAPIV3(TestHostAPI):
837835 def url(self, host=None, workspace=None):
838 return v2_to_v3(super(TestHostAPIV3, self).url(host, workspace))
836 return v2_to_v3(super().url(host, workspace))
839837
840838 def check_url(self, url):
841839 return v2_to_v3(url)
902900 session.flush()
903901 expected_ids.append(host.id)
904902 session.commit()
905 res = test_client.get(self.url(workspace=second_workspace) +
906 '?sort=services&sort_dir=asc')
903 res = test_client.get(self.url(workspace=second_workspace)
904 + '?sort=services&sort_dir=asc')
907905 assert res.status_code == 200
908906 assert [h['_id'] for h in res.json['data']] == expected_ids
909907
918916 expected = host_factory.create_batch(10, workspace=second_workspace)
919917 session.commit()
920918 for i in range(len(expected)):
921 if i % 2 == 0: # Only update some hosts
919 if i % 2 == 0: # Only update some hosts
922920 host = expected.pop(0)
923921 host.description = 'i was updated'
924922 session.add(host)
925923 session.commit()
926924 expected.append(host) # Put it on the end
927 res = test_client.get(self.url(workspace=second_workspace) +
928 '?sort=metadata.update_time&sort_dir=asc')
925 res = test_client.get(self.url(workspace=second_workspace)
926 + '?sort=metadata.update_time&sort_dir=asc')
929927 assert res.status_code == 200, res.data
930928 assert [h['_id'] for h in res.json['data']] == [h.id for h in expected]
931929
10351033 session.add(host)
10361034 session.commit()
10371035 data = {
1038 "description":"",
1039 "default_gateway":"",
1040 "ip":"127.0.0.1",
1041 "owned":False,
1042 "name":"127.0.0.1",
1043 "mac":"",
1044 "hostnames":["dasdas"],
1045 "owner":"faraday",
1046 "os":"Unknown",
1036 "description": "",
1037 "default_gateway": "",
1038 "ip": "127.0.0.1",
1039 "owned": False,
1040 "name": "127.0.0.1",
1041 "mac": "",
1042 "hostnames": ["dasdas"],
1043 "owner": "faraday",
1044 "os": "Unknown",
10471045 }
10481046
10491047 res = test_client.put(self.url(host, workspace=host.workspace), data=data)
11401138 view_class = HostsV3View
11411139
11421140 def url(self, obj=None, workspace=None):
1143 return v2_to_v3(super(TestHostAPIGenericV3, self).url(obj, workspace))
1141 return v2_to_v3(super().url(obj, workspace))
11441142
11451143
11461144 def host_json():
11551153 "create_time": st.integers(),
11561154 "update_controller_action": st.text(),
11571155 "owner": st.one_of(st.none(), st.text()),
1158 "command_id": st.one_of(st.none(), st.text(), st.integers()),}),
1156 "command_id": st.one_of(st.none(), st.text(), st.integers()), }),
11591157 "name": st.one_of(st.none(), st.text()),
11601158 "ip": st.one_of(st.none(), st.text()),
11611159 "_rev": st.one_of(st.none(), st.text()),
11821180
11831181 @given(HostData)
11841182 def send_api_request(raw_data):
1185
11861183 ws_name = host_with_hostnames.workspace.name
11871184 res = test_client.post(f'/v2/ws/{ws_name}/vulns/',
11881185 data=raw_data)
11901187
11911188 @given(HostData)
11921189 def send_api_request_v3(raw_data):
1193
11941190 ws_name = host_with_hostnames.workspace.name
11951191 res = test_client.post(f'/v3/ws/{ws_name}/vulns',
11961192 data=raw_data)
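The fuzzing helpers above build request payloads with Hypothesis `st.fixed_dictionaries` and drive them through `@given`. A self-contained miniature of the same idea (the fields here are illustrative, not the full HostData schema):

```python
# Miniature of the property-based pattern: Hypothesis generates dicts that
# match a loose schema, and @given feeds them to the check.
from hypothesis import given, settings, strategies as st

PayloadData = st.fixed_dictionaries({
    "ip": st.one_of(st.none(), st.text()),
    "description": st.one_of(st.none(), st.text()),
    "owned": st.booleans(),
})


@settings(max_examples=20)
@given(PayloadData)
def check_payload_shape(raw_data):
    # whatever Hypothesis generated, the payload is a dict with exactly
    # the declared keys
    assert set(raw_data) == {"ip", "description", "owned"}


check_payload_shape()
```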
0 #-*- coding: utf8 -*-
0 # -*- coding: utf8 -*-
11 '''
22 Faraday Penetration Test IDE
33 Copyright (C) 2013 Infobyte LLC (http://www.infobytesec.com/)
5555
5656 class TestLicensesAPIV3(TestLicensesAPI, PatchableTestsMixin):
5757 def url(self, obj=None):
58 return v2_to_v3(super(TestLicensesAPIV3, self).url(obj))
58 return v2_to_v3(super().url(obj))
5959
6060 @pytest.mark.skip(reason="Not a license actually test")
6161 def test_envelope_list(self, test_client, app):
7373 "creator": st.one_of(st.none(), st.text()),
7474 "create_time": st.floats(),
7575 "update_controller_action": st.one_of(st.none(), st.text()),
76 "owner": st.one_of(st.none(), st.text())}),
76 "owner": st.one_of(st.none(), st.text())}),
7777 "notes": st.one_of(st.none(), st.text()),
7878 "product": st.one_of(st.none(), st.text()),
7979 "start": st.datetimes(),
8080 "end": st.datetimes(),
8181 "type": st.one_of(st.none(), st.text())
82 })
82 })
8383
8484
8585 @pytest.mark.usefixtures('logged_user')
33 from itsdangerous import TimedJSONWebSignatureSerializer
44
55 from faraday.server.models import User
6 from faraday.server.web import app
6 from faraday.server.web import get_app
77 from tests import factories
8 from tests.conftest import logged_user, login_as
98 from tests.utils.url import v2_to_v3
109
1110
2928 session.commit()
3029 # the username is stored capitalized in the db
3130 login_payload = {
32 'email': 'susan',
31 'email': 'Susan',
3332 'password': 'pepito',
3433 }
3534 res = test_client.post('/login', data=login_payload)
7271 """
7372 # clean cookies make sure test_client has no session
7473 test_client.cookie_jar.clear()
75 secret_key = app.config['SECRET_KEY']
74 secret_key = get_app().config['SECRET_KEY']
7675 alice = factories.UserFactory.create(
7776 active=True,
7877 username='alice',
8584 session.add(ws)
8685 session.commit()
8786
88 serializer = TimedJSONWebSignatureSerializer(app.config['SECRET_KEY'], expires_in=500, salt="token")
89 token = serializer.dumps({ 'user_id': alice.id})
87 serializer = TimedJSONWebSignatureSerializer(get_app().config['SECRET_KEY'], expires_in=500, salt="token")
88 token = serializer.dumps({'user_id': alice.id})
9089
9190 headers = {'Authorization': b'Token ' + token}
9291
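The API token used here is an itsdangerous `TimedJSONWebSignatureSerializer` payload signed with the app's `SECRET_KEY`. A small sketch of issuing and checking such a token (itsdangerous 1.x API, where this class still exists; the salt and payload shape mirror the test above):

```python
# Sketch of the signed-token round trip used in the Authorization header.
from itsdangerous import BadSignature, SignatureExpired, TimedJSONWebSignatureSerializer

serializer = TimedJSONWebSignatureSerializer("not-a-real-secret", expires_in=500, salt="token")
token = serializer.dumps({"user_id": 1})   # bytes, suitable for 'Token ' + token

try:
    payload = serializer.loads(token)      # checks signature and expiry
    print("token belongs to user", payload["user_id"])
except SignatureExpired:
    print("token expired")
except BadSignature:
    print("token invalid")
```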
111110 assert 'Set-Cookie' not in res.headers
112111 cookies = [cookie.name for cookie in test_client.cookie_jar]
113112 assert "faraday_session_2" not in cookies
114
115113
116114 def test_cant_retrieve_token_unauthenticated(self, test_client):
117115 # clean cookies make sure test_client has no session
0 #-*- coding: utf8 -*-
0 # -*- coding: utf8 -*-
11 '''
22 Faraday Penetration Test IDE
33 Copyright (C) 2013 Infobyte LLC (http://www.infobytesec.com/)
55
66 '''
77 from builtins import str
8
9 from tests.utils.url import v2_to_v3
108
119 """Generic tests for APIs NOT prefixed with a workspace_name"""
1210
1614 API_PREFIX = '/v2/'
1715 OBJECT_COUNT = 5
1816
17
1918 @pytest.mark.usefixtures('logged_user')
2019 class GenericAPITest:
21
2220 model = None
2321 factory = None
2422 api_endpoint = None
145143
146144 @pytest.mark.parametrize("method", ["PUT", "PATCH"])
147145 def test_update_an_object(self, test_client, logged_user, method):
148 super(PatchableTestsMixin, self).test_update_an_object(test_client, logged_user, method)
146 super().test_update_an_object(test_client, logged_user, method)
149147
150148 @pytest.mark.parametrize("method", ["PUT", "PATCH"])
151149 def test_update_fails_with_existing(self, test_client, session, method):
152 super(PatchableTestsMixin, self).test_update_fails_with_existing(test_client, session, method)
150 super().test_update_fails_with_existing(test_client, session, method)
153151
154152 def test_patch_update_an_object_does_not_fail_with_partial_data(self, test_client, logged_user):
155153 """To do this the user should use a PATCH request"""
0 #-*- coding: utf8 -*-
0 # -*- coding: utf8 -*-
11 '''
22 Faraday Penetration Test IDE
33 Copyright (C) 2013 Infobyte LLC (http://www.infobytesec.com/)
1414 except ImportError as e:
1515 from urllib.parse import urlencode
1616
17
1718 def with_0_and_n_objects(n=10):
1819 return pytest.mark.parametrize('object_count', [0, n])
20
1921
2022 class PaginationTestsMixin:
2123
0 import pytest
1
20 from tests.test_api_non_workspaced_base import GenericAPITest
31 from tests.factories import UserFactory
42 from faraday.server.models import User
64 from tests.utils.url import v2_to_v3
75
86
9 pytest.fixture('logged_user')
7 # pytest.fixture('logged_user')
108 class TestPreferences(GenericAPITest):
119 model = User
1210 factory = UserFactory
3735 assert response.status_code == 200
3836 assert response.json['preferences'] == preferences
3937
40
4138 def test_add_invalid_preference(self, test_client):
4239 preferences = {'field1': 1, 'field2': 'str1'}
4340 data = {'p': preferences}
4845
4946 class TestPreferencesV3(TestPreferences):
5047 def url(self, obj=None):
51 return v2_to_v3(super(TestPreferencesV3, self).url(obj))
48 return v2_to_v3(super().url(obj))
0 #-*- coding: utf8 -*-
0 # -*- coding: utf8 -*-
11 '''
22 Faraday Penetration Test IDE
33 Copyright (C) 2013 Infobyte LLC (http://www.infobytesec.com/)
88
99 import pytest
1010
11 from tests.factories import SearchFilterFactory, UserFactory, SubFactory
12 from tests.test_api_non_workspaced_base import ReadWriteAPITests, OBJECT_COUNT, PatchableTestsMixin
13 from tests.test_api_agent import logout, http_req
11 from tests.factories import SearchFilterFactory, UserFactory
12 from tests.test_api_non_workspaced_base import ReadWriteAPITests, PatchableTestsMixin
13 from tests.test_api_agent import logout
1414 from tests.conftest import login_as
1515 from faraday.server.models import SearchFilter
16
1716
1817 from faraday.server.api.modules.search_filter import SearchFilterView
1918 from tests.utils.url import v2_to_v3
3231 def test_list_retrieves_all_items_from(self, test_client, logged_user):
3332 for searchfilter in SearchFilter.query.all():
3433 searchfilter.creator = logged_user
35 super(TestSearchFilterAPI, self).test_list_retrieves_all_items_from(test_client, logged_user)
34 super().test_list_retrieves_all_items_from(test_client, logged_user)
3635
3736 def test_list_retrieves_all_items_from_logger_user(self, test_client, session, logged_user):
3837 user_filter = SearchFilterFactory.create(creator=logged_user)
4746
4847 def test_retrieve_one_object(self, test_client, logged_user):
4948 self.first_object.creator = logged_user
50 super(TestSearchFilterAPI, self).test_retrieve_one_object(test_client, logged_user)
49 super().test_retrieve_one_object(test_client, logged_user)
5150
5251 def test_retrieve_one_object_from_logged_user(self, test_client, session, logged_user):
5352
107106 @pytest.mark.parametrize("method", ["PUT"])
108107 def test_update_an_object(self, test_client, logged_user, method):
109108 self.first_object.creator = logged_user
110 super(TestSearchFilterAPI, self).test_update_an_object(test_client, logged_user, method)
109 super().test_update_an_object(test_client, logged_user, method)
111110
112111 def test_update_an_object_fails_with_empty_dict(self, test_client, logged_user):
113112 self.first_object.creator = logged_user
114 super(TestSearchFilterAPI, self).test_update_an_object_fails_with_empty_dict(test_client, logged_user)
113 super().test_update_an_object_fails_with_empty_dict(test_client, logged_user)
115114
116115 def test_delete(self, test_client, logged_user):
117116 self.first_object.creator = logged_user
118 super(TestSearchFilterAPI, self).test_delete(test_client, logged_user)
117 super().test_delete(test_client, logged_user)
119118
120119
121120 @pytest.mark.usefixtures('logged_user')
122121 class TestSearchFilterAPIV3(TestSearchFilterAPI, PatchableTestsMixin):
123122 def url(self, obj=None):
124 return v2_to_v3(super(TestSearchFilterAPIV3, self).url(obj))
123 return v2_to_v3(super().url(obj))
125124
126125 @pytest.mark.parametrize("method", ["PUT", "PATCH"])
127126 def test_update_an_object(self, test_client, logged_user, method):
128 super(TestSearchFilterAPIV3, self).test_update_an_object(test_client, logged_user, method)
127 super().test_update_an_object(test_client, logged_user, method)
129128
130129 def test_patch_update_an_object_does_not_fail_with_partial_data(self, test_client, logged_user):
131130 self.first_object.creator = logged_user
132 super(TestSearchFilterAPIV3, self).test_update_an_object_fails_with_empty_dict(test_client, logged_user)
131 super().test_update_an_object_fails_with_empty_dict(test_client, logged_user)
269269 assert cmd_obj.object_type == 'service'
270270 assert cmd_obj.object_id == res.json['id']
271271
272
273272 def test_create_service_without_ost(self, test_client, host, session):
274273 session.commit()
275274 data = {
343342 view_class = ServiceV3View
344343
345344 def url(self, obj=None, workspace=None):
346 return v2_to_v3(super(TestListServiceViewV3, self).url(obj, workspace))
345 return v2_to_v3(super().url(obj, workspace))
347346
348347 @pytest.mark.parametrize("method", ["PUT", "PATCH"])
349348 def test_update_cant_change_id(self, test_client, session, method):
350 super(TestListServiceViewV3, self).test_update_cant_change_id(test_client, session, method)
349 super().test_update_cant_change_id(test_client, session, method)
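A change repeated across almost every hunk in this file set is replacing the explicit two-argument `super(SomeClass, self)` call with the zero-argument `super()` form that Python 3 supports inside methods. A minimal illustration of the pattern (class names here are invented for the example):

```python
class BaseView:
    def url(self, obj=None):
        return f"/v2/items/{obj.id if obj is not None else ''}"

class V3View(BaseView):
    def url(self, obj=None):
        # Python 3 resolves the class and instance implicitly, so
        # super(V3View, self).url(obj) shortens to:
        return super().url(obj)
```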
0 #-*- coding: utf8 -*-
0 # -*- coding: utf8 -*-
11 '''
22 Faraday Penetration Test IDE
33 Copyright (C) 2013 Infobyte LLC (http://www.infobytesec.com/)
2222 except ImportError:
2323 from urllib.parse import urlencode
2424
25
2625 import pytz
2726 import pytest
2827 from dateutil import parser
3029
3130 from hypothesis import given, settings, strategies as st
3231
33 from sqlalchemy import inspect
3432 from faraday.server.api.modules.vulns import (
3533 VulnerabilityFilterSet,
3634 VulnerabilitySchema,
7270 CustomFieldsSchemaFactory
7371 )
7472
73
7574 def _create_post_data_vulnerability(name, vuln_type, parent_id,
7675 parent_type, refs, policyviolations,
7776 status='opened',
7877 attachments=None, impact=None,
7978 description='desc1234',
8079 confirmed=True, data='data1234',
81 easeofresolution=
82 Vulnerability.EASE_OF_RESOLUTIONS[0],
80 easeofresolution=Vulnerability.EASE_OF_RESOLUTIONS[0],
8381 owned=False, resolution='res1234',
8482 severity='critical',
8583 update_controller_action='UI Web',
159157 model = Vulnerability
160158 factory = factories.VulnerabilityFactory
161159 api_endpoint = 'vulns'
162 #unique_fields = ['ip']
163 #update_fields = ['ip', 'description', 'os']
160 # unique_fields = ['ip']
161 # update_fields = ['ip', 'description', 'os']
164162 view_class = VulnerabilityView
165163 patchable_fields = ['description']
166164
563561 impact = {"accountability": False, "availability": False, "confidentiality": False, "integrity": False}
564562
565563 raw_data = {
566 "_id":"e1b45f5375facfb1435d37e182ebc22de5f77bb3.e05df1c85617fffb575d2ced2679e9a0ebda7c3e",
567 "metadata":{
568 "update_time":1509045001.279,
569 "update_user":"",
570 "update_action":0,
571 "creator":"UI Web",
572 "create_time":1509045001.279,
564 "_id": "e1b45f5375facfb1435d37e182ebc22de5f77bb3.e05df1c85617fffb575d2ced2679e9a0ebda7c3e",
565 "metadata": {
566 "update_time": 1509045001.279,
567 "update_user": "",
568 "update_action": 0,
569 "creator": "UI Web",
570 "create_time": 1509045001.279,
573571 "update_controller_action":
574572 "UI Web New",
575 "owner":""},
576 "obj_id":"e05df1c85617fffb575d2ced2679e9a0ebda7c3e",
577 "owner":"",
573 "owner": ""},
574 "obj_id": "e05df1c85617fffb575d2ced2679e9a0ebda7c3e",
575 "owner": "",
578576 "parent": parent,
579 "type":"Vulnerability",
580 "ws":"cloud",
577 "type": "Vulnerability",
578 "ws": "cloud",
581579 "confirmed": True,
582 "data":"",
580 "data": "",
583581 "desc": desc,
584 "easeofresolution":None,
582 "easeofresolution": None,
585583 "impact": impact,
586584 "name": name,
587585 "owned": False,
588 "policyviolations":policy_violations,
586 "policyviolations": policy_violations,
589587 "refs": refs,
590 "resolution":"",
588 "resolution": "",
591589 "severity": "critical",
592590 "status": status,
593 "_attachments":{},
594 "description":"",
591 "_attachments": {},
592 "description": "",
595593 "parent_type": parent_type,
596 "protocol":"",
597 "version":""}
594 "protocol": "",
595 "version": ""}
598596
599597 if attachments:
600598 raw_data['_attachments'] = {}
601599 for attachment in attachments:
602600 raw_data['_attachments'][attachment.name] = {
603 "content_type": "application/x-shellscript",
604 "data": b64encode(attachment.read()).decode()
605 }
601 "content_type": "application/x-shellscript",
602 "data": b64encode(attachment.read()).decode()
603 }
606604
607605 return raw_data
608606
609607 def test_update_vuln_from_open_to_close(self, test_client, session, host_with_hostnames):
610 vuln = self.factory.create(status='open', host=host_with_hostnames, service=None, workspace=host_with_hostnames.workspace)
608 vuln = self.factory.create(status='open', host=host_with_hostnames, service=None,
609 workspace=host_with_hostnames.workspace)
611610 session.commit()
612611 raw_data = self._create_put_data(
613612 name='New name',
627626 assert res.json['desc'] == 'New desc'
628627
629628 def test_update_vuln_from_correct_type_to_incorrect(self, test_client, session, host_with_hostnames):
630 vuln = self.factory.create(status='open', host=host_with_hostnames, service=None, workspace=host_with_hostnames.workspace)
629 vuln = self.factory.create(status='open', host=host_with_hostnames, service=None,
630 workspace=host_with_hostnames.workspace)
631631 session.commit()
632632 raw_data = self._create_put_data(
633633 name='New name',
641641 raw_data['type'] = "ASDADADASD"
642642 vuln_count_previous = session.query(Vulnerability).count()
643643 res = test_client.put(self.url(vuln), data=raw_data)
644 assert res.status_code in [400,409]
644 assert res.status_code in [400, 409]
645645 assert vuln_count_previous == session.query(Vulnerability).count()
646646
647647 def test_create_vuln_web(self, host_with_hostnames, test_client, session):
670670 assert res.json['method'] == 'GET'
671671 assert res.json['path'] == '/pepep'
672672
673
674
675673 @pytest.mark.parametrize('param_name', ['query', 'query_string'])
676674 @pytest.mark.usefixtures('mock_envelope_list')
677675 def test_filter_by_querystring(
701699 for vuln in res.json['data']:
702700 assert vuln['query'] == 'bbb'
703701 assert set(vuln['_id'] for vuln in res.json['data']) == expected_ids
704
705702
706703 @pytest.mark.usefixtures('mock_envelope_list')
707704 @pytest.mark.parametrize('medium_name', ['medium', 'med'])
753750 # Vulns that shouldn't be shown
754751 vuln_second_workspace = vulnerability_factory.create_batch(5, workspace=second_workspace)
755752 more_vuln_second_workspace = vulnerability_web_factory.create_batch(5, workspace=second_workspace,
756 method='POSTT')
753 method='POSTT')
757754
758755 # Vulns that must be shown
759756 expected_vulns = vulnerability_web_factory.create_batch(
839836 assert set(vuln['_id'] for vuln in res.json['data']) == expected_ids
840837
841838 @pytest.mark.usefixtures('ignore_nplusone')
842 def test_filter_restless_by_target(self, test_client, session, workspace, host_factory):
843
844 host_factory.create(workspace=workspace, ip="192.168.0.1")
845 host_factory.create(workspace=workspace, ip="192.168.0.2")
846
847 session.commit()
848 res = test_client.get(self.check_url(urljoin(
849 self.url(), 'filter?q={"filters":[{"name": "target", "op":"eq", "val":"192.168.0.2"}]}'
850 )))
851 assert res.status_code == 200
852
853 @pytest.mark.usefixtures('ignore_nplusone')
854 def test_filter_restless_by_target_host_ip(self, test_client, session, workspace,
855 host_factory, vulnerability_factory):
839 def test_filter_restless_by_target__(self, test_client, session, workspace, host_factory, vulnerability_factory):
856840
857841 Vulnerability.query.delete()
858842 host = host_factory.create(workspace=workspace, ip="192.168.0.2")
865849
866850 session.commit()
867851 res = test_client.get(self.check_url(urljoin(
852 self.url(), 'filter?q={"filters":[{"name": "target", "op":"eq", "val":"192.168.0.1"}]}'
853 )))
854
855 assert res.status_code == 200
856 assert len(res.json['vulnerabilities']) == 10
857
858 @pytest.mark.usefixtures('ignore_nplusone')
859 def test_filter_restless_by_target_host_ip(self, test_client, session, workspace,
860 host_factory, vulnerability_factory):
861
862 Vulnerability.query.delete()
863 host = host_factory.create(workspace=workspace, ip="192.168.0.2")
864 host_vulns = vulnerability_factory.create_batch(
865 1, workspace=self.workspace, host=host, service=None)
866
867 host2 = host_factory.create(workspace=workspace, ip="192.168.0.1")
868 host_vulns2 = vulnerability_factory.create_batch(
869 10, workspace=self.workspace, host=host2, service=None)
870
871 session.commit()
872 res = test_client.get(self.check_url(urljoin(
868873 self.url(),
869874 'filter?q={"filters":[{"name": "target_host_ip", "op":"eq", "val":"192.168.0.2"}]}'
870875 )))
872877 assert len(res.json['vulnerabilities']) == 1
873878 assert res.json['vulnerabilities'][0]['value']['target'] == '192.168.0.2'
874879
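The restless-style filter endpoints exercised in these tests receive a JSON document through the `q` query parameter. The tests hand-write that JSON inside the URL; a small helper such as the one below (hypothetical, not part of the test suite) shows how the same URLs could be built programmatically:

```python
import json
from urllib.parse import urlencode

def build_filter_url(base_url, filters):
    """Build a restless-style filter URL, e.g.
    build_filter_url('/v3/ws/demo/vulns/filter',
                     [{"name": "target", "op": "eq", "val": "192.168.0.2"}])
    """
    return f"{base_url}?{urlencode({'q': json.dumps({'filters': filters})})}"
```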
875
876880 @pytest.mark.usefixtures('ignore_nplusone')
877881 def test_filter_restless_by_service_port(self, test_client, session, workspace,
878 host_factory, vulnerability_factory,
879 vulnerability_web_factory, service_factory):
882 host_factory, vulnerability_factory,
883 vulnerability_web_factory, service_factory):
880884
881885 service = service_factory.create(port=9098, name="ssh", workspace=self.workspace)
882886 vulns = vulnerability_factory.create_batch(
883887 1, workspace=self.workspace, service=service, host=None)
884
885888
886889 service = service_factory.create(port=8956, name="443", workspace=self.workspace)
887890
899902
900903 @pytest.mark.usefixtures('ignore_nplusone')
901904 def test_filter_restless_by_service_name(self, test_client, session, workspace,
902 host_factory, vulnerability_factory,
903 vulnerability_web_factory, service_factory):
905 host_factory, vulnerability_factory,
906 vulnerability_web_factory, service_factory):
904907
905908 service = service_factory.create(port=9098, name="ssh", workspace=self.workspace)
906909 vulns = vulnerability_factory.create_batch(
907910 1, workspace=self.workspace, service=service, host=None)
908
909911
910912 service = service_factory.create(port=8956, name="443", workspace=self.workspace)
911913
937939 method=method)
938940
939941 session.commit()
940 res = test_client.get(self.url(workspace=second_workspace) +
941 '?sort=method&sort_dir=asc')
942 res = test_client.get(self.url(workspace=second_workspace)
943 + '?sort=method&sort_dir=asc')
942944 assert res.status_code == 200, res.data
943945 assert len(res.json['data']) == 30
944946 assert ''.join(v['method'] for v in res.json['data']
945947 if v['method']) == 'abcdefghij'
946948
947 res = test_client.get(self.url(workspace=second_workspace) +
948 '?sort=method&sort_dir=desc')
949 res = test_client.get(self.url(workspace=second_workspace)
950 + '?sort=method&sort_dir=desc')
949951 assert res.status_code == 200, res.data
950952 assert len(res.json['data']) == 30
951953 assert ''.join(v['method'] for v in res.json['data']
10341036 assert vuln_count_previous + 1 == session.query(Vulnerability).count()
10351037 assert res.json['name'] == 'New vulns'
10361038 assert res.json['impact'] == {u'accountability': True,
1037 u'availability': True,
1038 u'confidentiality': True,
1039 u'integrity': True}
1039 u'availability': True,
1040 u'confidentiality': True,
1041 u'integrity': True}
10401042
10411043 def test_handles_invalid_impact(self, host_with_hostnames, test_client,
10421044 session):
11841186 session.add(vuln)
11851187 session.commit()
11861188
1187 #Desc
1188 res = test_client.get(
1189 self.check_url(urljoin(self.url(), "count/")) +
1190 "?confirmed=1&group_by=severity&order=sc"
1189 # Desc
1190 res = test_client.get(
1191 self.check_url(urljoin(self.url(), "count/"))
1192 + "?confirmed=1&group_by=severity&order=sc"
11911193 )
11921194 assert res.status_code == 400
11931195
1194 #Asc
1195 res = test_client.get(
1196 self.check_url(urljoin(self.url(), "count/")) +
1197 "?confirmed=1&group_by=severity&order=name,asc"
1196 # Asc
1197 res = test_client.get(
1198 self.check_url(urljoin(self.url(), "count/"))
1199 + "?confirmed=1&group_by=severity&order=name,asc"
11981200 )
11991201 assert res.status_code == 400
1200
12011202
12021203 def test_count_order_by(self, test_client, session):
12031204 for i, vuln in enumerate(self.objects[:3]):
12111212 session.add(vuln)
12121213 session.commit()
12131214
1214 #Desc
1215 res = test_client.get(
1216 self.check_url(urljoin(self.url(),"count/")) + "?confirmed=1&group_by=severity&order=desc"
1215 # Desc
1216 res = test_client.get(
1217 self.check_url(urljoin(self.url(), "count/")) + "?confirmed=1&group_by=severity&order=desc"
12171218 )
12181219 assert res.status_code == 200
12191220 assert res.json['total_count'] == 3
1220 assert sorted(res.json['groups'], key=lambda i: (i['name'],i['count'],i['severity'])) == sorted([
1221 assert sorted(res.json['groups'], key=lambda i: (i['name'], i['count'], i['severity'])) == sorted([
12211222 {"name": "high", "severity": "high", "count": 2},
12221223 {"name": "critical", "severity": "critical", "count": 1},
1223 ], key=lambda i: (i['name'],i['count'],i['severity']))
1224
1225 #Asc
1226 res = test_client.get(self.check_url(urljoin(self.url(),"count/"))+"?confirmed=1&group_by=severity&order=asc")
1224 ], key=lambda i: (i['name'], i['count'], i['severity']))
1225
1226 # Asc
1227 res = test_client.get(
1228 self.check_url(urljoin(self.url(), "count/")) + "?confirmed=1&group_by=severity&order=asc")
12271229 assert res.status_code == 200
12281230 assert res.json['total_count'] == 3
1229 assert sorted(res.json['groups'], key=lambda i: (i['name'],i['count'],i['severity']), reverse=True) == sorted([
1230 {"name": "critical", "severity": "critical", "count": 1},
1231 {"name": "high", "severity": "high", "count": 2},
1232 ], key=lambda i: (i['name'],i['count'],i['severity']), reverse=True)
1231 assert sorted(res.json['groups'], key=lambda i: (i['name'], i['count'], i['severity']), reverse=True) == sorted(
1232 [
1233 {"name": "critical", "severity": "critical", "count": 1},
1234 {"name": "high", "severity": "high", "count": 2},
1235 ], key=lambda i: (i['name'], i['count'], i['severity']), reverse=True)
12331236
12341237 def test_count_group_by_incorrect_vuln_column(self, test_client, session):
12351238 for i, vuln in enumerate(self.objects[:3]):
12431246 session.add(vuln)
12441247 session.commit()
12451248
1246 res = test_client.get(self.check_url(urljoin(self.url(),"count/")) + "?confirmed=1&group_by=username")
1249 res = test_client.get(self.check_url(urljoin(self.url(), "count/")) + "?confirmed=1&group_by=username")
12471250 assert res.status_code == 400
12481251
1249 res = test_client.get(self.check_url(urljoin(self.url(),"count/")) + "?confirmed=1&group_by=")
1252 res = test_client.get(self.check_url(urljoin(self.url(), "count/")) + "?confirmed=1&group_by=")
12501253 assert res.status_code == 400
12511254
12521255 def test_count_confirmed(self, test_client, session):
12621265 session.add(vuln)
12631266 session.commit()
12641267
1265 res = test_client.get(self.check_url(urljoin(self.url(),'count/')) + '?confirmed=1&group_by=severity')
1268 res = test_client.get(self.check_url(urljoin(self.url(), 'count/')) + '?confirmed=1&group_by=severity')
12661269 assert res.status_code == 200
12671270 assert res.json['total_count'] == 3
1268 assert sorted(res.json['groups'], key=lambda i: (i['count'],i['name'],i['severity'])) == sorted([
1271 assert sorted(res.json['groups'], key=lambda i: (i['count'], i['name'], i['severity'])) == sorted([
12691272 {"name": "high", "severity": "high", "count": 2},
12701273 {"name": "critical", "severity": "critical", "count": 1},
1271 ], key=lambda i: (i['count'],i['name'],i['severity']))
1274 ], key=lambda i: (i['count'], i['name'], i['severity']))
12721275
12731276 def test_count_severity_map(self, test_client, second_workspace, session):
12741277 vulns = self.factory.create_batch(4, severity='informational',
1275 workspace=second_workspace)
1278 workspace=second_workspace)
12761279 vulns += self.factory.create_batch(3, severity='medium',
1277 workspace=second_workspace)
1280 workspace=second_workspace)
12781281 vulns += self.factory.create_batch(2, severity='low',
1279 workspace=second_workspace)
1282 workspace=second_workspace)
12801283 session.add_all(vulns)
12811284 session.commit()
12821285
12831286 res = test_client.get(
1284 self.check_url(urljoin(self.url(workspace=second_workspace),'count/')) + '?group_by=severity'
1287 self.check_url(urljoin(self.url(workspace=second_workspace), 'count/')) + '?group_by=severity'
12851288 )
12861289 assert res.status_code == 200
12871290 assert res.json['total_count'] == 9
1288 assert sorted(res.json['groups'], key=lambda i: (i['count'],i['name'],i['severity'])) == sorted([
1291 assert sorted(res.json['groups'], key=lambda i: (i['count'], i['name'], i['severity'])) == sorted([
12891292 {"name": "med", "severity": "med", "count": 3},
12901293 {"name": "low", "severity": "low", "count": 2},
12911294 {"name": "info", "severity": "info", "count": 4},
1292 ], key=lambda i: (i['count'],i['name'],i['severity']))
1295 ], key=lambda i: (i['count'], i['name'], i['severity']))
12931296
12941297 def test_count_multiworkspace_one_workspace(self, test_client, session):
12951298 for i, vuln in enumerate(self.objects):
13041307 session.commit()
13051308
13061309 res = test_client.get(
1307 self.check_url(urljoin(self.url(), 'count_multi_workspace/')) +
1308 f'?workspaces={self.workspace.name}&confirmed=1&group_by=severity&order=desc'
1310 self.check_url(urljoin(self.url(), 'count_multi_workspace/'))
1311 + f'?workspaces={self.workspace.name}&confirmed=1&group_by=severity&order=desc'
13091312 )
13101313
13111314 assert res.status_code == 200
13141317
13151318 def test_count_multiworkspace_two_public_workspaces(self, test_client, session, second_workspace):
13161319 vulns = self.factory.create_batch(1, severity='informational',
1317 workspace=second_workspace)
1320 workspace=second_workspace)
13181321 vulns += self.factory.create_batch(3, severity='medium',
1319 workspace=second_workspace)
1322 workspace=second_workspace)
13201323 vulns += self.factory.create_batch(1, severity='low',
1321 workspace=second_workspace)
1324 workspace=second_workspace)
13221325 session.add_all(vulns)
13231326 session.commit()
13241327
13341337 session.commit()
13351338
13361339 res = test_client.get(
1337 self.check_url(urljoin(self.url(), 'count_multi_workspace/')) +
1338 f'?workspaces={self.workspace.name},{second_workspace.name}&confirmed=1&group_by=severity&order=desc'
1340 self.check_url(urljoin(self.url(), 'count_multi_workspace/'))
1341 + f'?workspaces={self.workspace.name},{second_workspace.name}&confirmed=1&group_by=severity&order=desc'
13391342 )
13401343
13411344 assert res.status_code == 200
13441347
13451348 def test_count_multiworkspace_no_workspace_param(self, test_client):
13461349 res = test_client.get(
1347 self.check_url(urljoin(self.url(), 'count_multi_workspace/')) +
1348 '?confirmed=1&group_by=severity&order=desc'
1350 self.check_url(urljoin(self.url(), 'count_multi_workspace/'))
1351 + '?confirmed=1&group_by=severity&order=desc'
13491352 )
13501353 assert res.status_code == 400
13511354
13521355 def test_count_multiworkspace_no_groupby_param(self, test_client):
13531356 res = test_client.get(
1354 self.check_url(urljoin(self.url(), 'count_multi_workspace/')) +
1355 f'?workspaces={self.workspace.name}&confirmed=1&order=desc'
1357 self.check_url(urljoin(self.url(), 'count_multi_workspace/'))
1358 + f'?workspaces={self.workspace.name}&confirmed=1&order=desc'
13561359 )
13571360 assert res.status_code == 400
13581361
13591362 def test_count_multiworkspace_nonexistent_ws(self, test_client):
13601363 res = test_client.get(
1361 self.check_url(urljoin(self.url(), 'count_multi_workspace/')) +
1362 '?workspaces=asdf,{self.workspace.name}&confirmed=1&group_by=severity&order=desc'
1364 self.check_url(urljoin(self.url(), 'count_multi_workspace/'))
1365 + '?workspaces=asdf,{self.workspace.name}&confirmed=1&group_by=severity&order=desc'
13631366 )
13641367 assert res.status_code == 404
13651368
14431446 workspace=second_workspace
14441447 )
14451448 for high_vuln in high_vulns:
1446
14471449 CommandObjectFactory.create(
14481450 command=command,
14491451 object_type='vulnerability',
14511453 workspace=second_workspace
14521454 )
14531455 for high_vuln_web in high_vulns_web:
1454
14551456 CommandObjectFactory.create(
14561457 command=web_command,
14571458 object_type='vulnerability',
15251526 res = test_client.get(self.url())
15261527 assert res.status_code == 200
15271528 from_json_vuln = list(filter(lambda raw_vuln: raw_vuln['id'] == vuln.id,
1528 res.json['vulnerabilities']))
1529 res.json['vulnerabilities']))
15291530 assert 'metadata' in from_json_vuln[0]['value']
15301531 expected_metadata = {
15311532 u'command_id': command.id,
17171718 severity='high',
17181719 )
17191720 res = test_client.put(
1720 self.check_url(urljoin(self.url(workspace=host_with_hostnames.workspace), f'{res.json["_id"]}/')) +
1721 f'?command_id={command.id}',
1721 self.check_url(urljoin(self.url(workspace=host_with_hostnames.workspace), f'{res.json["_id"]}/'))
1722 + f'?command_id={command.id}',
17221723 data=raw_data)
17231724 assert res.status_code == 200
17241725
17891790 """
17901791 raw_data = {
17911792 'command_id': None,
1792 'confirmed': False,
1793 'data': None,
1794 'desc': 'pepe',
1795 'description': 'pepe',
1796 'metadata': {
1793 'confirmed': False,
1794 'data': None,
1795 'desc': 'pepe',
1796 'description': 'pepe',
1797 'metadata': {
17971798 'command_id': '',
17981799 'create_time': 1518627247.194113,
17991800 'creator': '',
18021803 'update_controller_action': 'No model controller call',
18031804 'update_time': 1518627247.194114,
18041805 'update_user': ''},
1805 'name': 'vuln1',
1806 'owned': False,
1807 'owner': '',
1808 'parent': '358302',
1809 'parent_type': 'Host',
1810 'policyviolations': [],
1811 'refs': [],
1812 'resolution': '',
1813 'severity': 'critical',
1814 'status': 'opened',
1815 'type': 'Vulnerability'
1806 'name': 'vuln1',
1807 'owned': False,
1808 'owner': '',
1809 'parent': '358302',
1810 'parent_type': 'Host',
1811 'policyviolations': [],
1812 'refs': [],
1813 'resolution': '',
1814 'severity': 'critical',
1815 'status': 'opened',
1816 'type': 'Vulnerability'
18161817 }
18171818
18181819 res = test_client.post(self.url(), data=raw_data)
18891890 assert res.json['vulnerabilities'][0]['value']['name'] == vuln.name
18901891
18911892 def test_hostnames_comma_separated(self, test_client, session):
1892 #Create Host A with hostname HA
1893 # Create Host A with hostname HA
18931894 hostnameA = HostnameFactory.create()
18941895 hostnameA.host.workspace = hostnameA.workspace
1895 #Create Host B with hostname HB
1896 # Create Host B with hostname HB
18961897 hostnameB = HostnameFactory.create(workspace=hostnameA.workspace)
18971898 hostnameB.host.workspace = hostnameA.workspace
1898 #Create Vuln with Host A
1899 # Create Vuln with Host A
18991900 vuln = VulnerabilityFactory.create(host=hostnameA.host, workspace=hostnameA.workspace)
1900 #Create Vuln with Host B
1901 # Create Vuln with Host B
19011902 vuln2 = VulnerabilityFactory.create(host=hostnameB.host, workspace=hostnameA.workspace)
19021903 session.add(hostnameA)
19031904 session.add(hostnameB)
19051906 session.add(vuln2)
19061907 session.commit()
19071908
1908 #Search with hostnames=HA,HB
1909 # Search with hostnames=HA,HB
19091910 res = test_client.get(self.url(workspace=vuln.workspace) + f'?hostname={hostnameA},{hostnameB}')
19101911 assert res.status_code == 200
19111912 assert res.json['count'] == 2
19171918 host = HostFactory.create(workspace=self.workspace)
19181919 session.commit()
19191920 data = {
1920 'name': 'Test Alert policy_violations',
1921 'severity': 'informational',
1922 'creator': 'Zap',
1923 'parent_type': 'Host',
1924 'parent': host.id,
1925 'type': 'Vulnerability',
1921 'name': 'Test Alert policy_violations',
1922 'severity': 'informational',
1923 'creator': 'Zap',
1924 'parent_type': 'Host',
1925 'parent': host.id,
1926 'type': 'Vulnerability',
19261927 }
19271928 res = test_client.post(self.url(), data=data)
19281929 assert res.status_code == 201
19341935 host = HostFactory.create(workspace=self.workspace)
19351936 session.commit()
19361937 data = {
1937 'name': 'Test Alert policy_violations',
1938 'severity': 'informational',
1939 'creator': 'Zap',
1940 'parent_type': 'Host',
1941 'parent': host.id,
1942 'type': 'Vulnerability',
1938 'name': 'Test Alert policy_violations',
1939 'severity': 'informational',
1940 'creator': 'Zap',
1941 'parent_type': 'Host',
1942 'parent': host.id,
1943 'type': 'Vulnerability',
19431944 }
19441945 res = test_client.post(self.url(), data=data)
19451946 assert res.status_code == 201
19981999 assert query_test == []
19992000
20002001 def test_delete_attachment_from_vuln(self, test_client, session, host_with_hostnames):
2001 session.commit() # flush host_with_hostnames
2002 session.commit() # flush host_with_hostnames
20022003 ws_name = host_with_hostnames.workspace.name
20032004 attachment = NamedTemporaryFile()
20042005 file_content = b'test file'
20272028 assert query_test == []
20282029
20292030 def test_delete_attachment_from_vuln_fails_readonly(self, test_client, session, host_with_hostnames):
2030 session.commit() # flush host_with_hostnames
2031 session.commit() # flush host_with_hostnames
20312032 ws_name = host_with_hostnames.workspace.name
20322033 attachment = NamedTemporaryFile()
20332034 file_content = b'test file'
20862087
20872088 def test_invalid_vuln_filters(self, test_client, session, workspace):
20882089 data = {
2089 "q": {"filters":[{"name":"severity","op":"eq","val":"medium"}]}
2090 "q": {"filters": [{"name": "severity", "op": "eq", "val": "medium"}]}
20902091 }
20912092 res = test_client.get(self.check_url(f'/v2/ws/{workspace.name}/vulns/filter'), query_string=data)
20922093 assert res.status_code == 400
21062107 workspace = WorkspaceFactory.create()
21072108 creator = UserFactory.create()
21082109 vuln = VulnerabilityFactory.create(
2109 workspace=workspace,
2110 severity="medium",
2111 creator=creator,
2110 workspace=workspace,
2111 severity="medium",
2112 creator=creator,
21122113 )
21132114 vuln2 = VulnerabilityFactory.create(
2114 workspace=workspace,
2115 severity="medium",
2116 creator=creator,
2115 workspace=workspace,
2116 severity="medium",
2117 creator=creator,
21172118 )
21182119 session.add(vuln)
21192120 session.add(vuln2)
21202121 session.commit()
21212122 data = {
2122 'q': '{"group_by":[{"field":"creator_id"}]}'
2123 'q': '{"group_by":[{"field":"creator_id"}]}'
21232124 }
21242125 res = test_client.get(self.check_url(f'/v2/ws/{workspace.name}/vulns/filter'), query_string=data)
21252126 assert res.status_code == 200
2126 assert res.json['count'] == 1 # all vulns created by the same creator
2127 assert res.json['count'] == 1 # all vulns created by the same creator
21272128 expected = [{'count': 2, 'creator_id': creator.id}]
21282129 assert [vuln['value'] for vuln in res.json['vulnerabilities']] == expected
21292130
21312132 workspace = WorkspaceFactory.create()
21322133 creator = UserFactory.create()
21332134 vuln = VulnerabilityFactory.create_batch(size=10,
2134 workspace=workspace,
2135 severity="critical",
2136 creator=creator,
2137 )
2135 workspace=workspace,
2136 severity="critical",
2137 creator=creator,
2138 )
21382139 vuln2 = VulnerabilityWebFactory.create_batch(size=10,
2139 workspace=workspace,
2140 severity="critical",
2141 creator=creator,
2142 )
2140 workspace=workspace,
2141 severity="critical",
2142 creator=creator,
2143 )
21432144 session.add_all(vuln)
21442145 session.add_all(vuln2)
21452146 session.commit()
21462147 data = {
2147 'q': '{"group_by":[{"field":"severity"}]}'
2148 'q': '{"group_by":[{"field":"severity"}]}'
21482149 }
21492150 res = test_client.get(self.check_url(f'/v2/ws/{workspace.name}/vulns/filter'), query_string=data)
21502151 assert res.status_code == 200, res.json
2152 assert res.json['count'] == 1, res.json # all vulns share the same severity, so grouping yields one group
2152 expected = {
2152 assert res.json['count'] == 1, res.json # all vulns created by the same creator
2153 expected = {
21532154 'count': 1,
21542155 'vulnerabilities': [
21552156 {'id': 0, 'key': 0, 'value': {'count': 20, 'severity': 'critical'}}
21612162 workspace = WorkspaceFactory.create()
21622163 creator = UserFactory.create()
21632164 vuln = VulnerabilityFactory.create_batch(size=10,
2164 name='name 1',
2165 workspace=workspace,
2166 severity="critical",
2167 creator=creator,
2168 )
2165 name='name 1',
2166 workspace=workspace,
2167 severity="critical",
2168 creator=creator,
2169 )
21692170 vuln2 = VulnerabilityWebFactory.create_batch(size=10,
2170 name='name 2',
2171 workspace=workspace,
2172 severity="critical",
2173 creator=creator,
2174 )
2171 name='name 2',
2172 workspace=workspace,
2173 severity="critical",
2174 creator=creator,
2175 )
21752176 session.add_all(vuln)
21762177 session.add_all(vuln2)
21772178 session.commit()
21782179 data = {
2179 'q': '{"group_by":[{"field":"severity"}, {"field": "name"}]}'
2180 'q': '{"group_by":[{"field":"severity"}, {"field": "name"}]}'
21802181 }
21812182 res = test_client.get(self.check_url(f'/v2/ws/{workspace.name}/vulns/filter'), query_string=data)
21822183 assert res.status_code == 200, res.json
2183 assert res.json['count'] == 2, res.json # all vulns created by the same creator
2184 expected ={'vulnerabilities': [
2185 {'id': 0, 'key': 0, 'value': {'count': 10, 'severity': 'critical', 'name': 'name 1'}}, {'id': 1, 'key': 1, 'value': {'count': 10, 'severity': 'critical', 'name': 'name 2'}}], 'count': 2}
2184 assert res.json['count'] == 2, res.json # two distinct vuln names, so grouping yields two groups
2185 expected = {'vulnerabilities': [
2186 {'id': 0, 'key': 0, 'value': {'count': 10, 'severity': 'critical', 'name': 'name 1'}},
2187 {'id': 1, 'key': 1, 'value': {'count': 10, 'severity': 'critical', 'name': 'name 2'}}], 'count': 2}
21862188
21872189 assert res.json == expected, res.json
21882190
21962198 workspace = WorkspaceFactory.create()
21972199 creator = UserFactory.create()
21982200 vuln = VulnerabilityFactory.create_batch(size=10,
2199 workspace=workspace,
2200 severity="critical",
2201 creator=creator,
2202 )
2201 workspace=workspace,
2202 severity="critical",
2203 creator=creator,
2204 )
22032205 vuln2 = VulnerabilityWebFactory.create_batch(size=10,
2204 workspace=workspace,
2205 severity="critical",
2206 creator=creator,
2207 )
2206 workspace=workspace,
2207 severity="critical",
2208 creator=creator,
2209 )
22082210 session.add_all(vuln)
22092211 session.add_all(vuln2)
22102212 session.commit()
22112213 data = {
2212 'q': json.dumps({"group_by":[{"field":col_name}]})
2214 'q': json.dumps({"group_by": [{"field": col_name}]})
22132215 }
22142216 res = test_client.get(self.check_url(f'/v2/ws/{workspace.name}/vulns/filter'), query_string=data)
22152217 assert res.status_code == 200, res.json
22182220 workspace = WorkspaceFactory.create()
22192221 creator = UserFactory.create()
22202222 vuln = VulnerabilityFactory.create(
2221 name="test",
2222 description="test",
2223 workspace=workspace,
2224 severity="medium",
2225 creator=creator,
2223 name="test",
2224 description="test",
2225 workspace=workspace,
2226 severity="medium",
2227 creator=creator,
22262228 )
22272229 vuln2 = VulnerabilityFactory.create(
2228 name="test",
2229 description="test",
2230 workspace=workspace,
2231 severity="medium",
2232 creator=creator,
2230 name="test",
2231 description="test",
2232 workspace=workspace,
2233 severity="medium",
2234 creator=creator,
22332235 )
22342236 vuln3 = VulnerabilityFactory.create(
2235 name="test2",
2236 description="test",
2237 workspace=workspace,
2238 severity="medium",
2239 creator=creator,
2237 name="test2",
2238 description="test",
2239 workspace=workspace,
2240 severity="medium",
2241 creator=creator,
22402242 )
22412243 session.add(vuln)
22422244 session.add(vuln2)
22432245 session.add(vuln3)
22442246 session.commit()
22452247 data = {
2246 'q': '{"group_by":[{"field":"name"}, {"field":"description"}]}'
2248 'q': '{"group_by":[{"field":"name"}, {"field":"description"}]}'
22472249 }
22482250 res = test_client.get(self.check_url(f'/v2/ws/{workspace.name}/vulns/filter'), query_string=data)
22492251 assert res.status_code == 200
22502252 assert res.json['count'] == 2
2251 expected = [{'count': 2, 'name': 'test', 'description': 'test'}, {'count': 1, 'name': 'test2', 'description': 'test'}]
2253 expected = [{'count': 2, 'name': 'test', 'description': 'test'},
2254 {'count': 1, 'name': 'test2', 'description': 'test'}]
22522255 assert [vuln['value'] for vuln in res.json['vulnerabilities']] == expected
22532256
22542257 def test_vuln_restless_sort_by_(self, test_client, session):
22572260 host2 = HostFactory.create(workspace=workspace)
22582261 creator = UserFactory.create()
22592262 vuln = VulnerabilityFactory.create(
2260 name="test",
2261 description="test",
2262 workspace=workspace,
2263 severity="critical",
2264 creator=creator,
2265 service=None,
2266 host=host,
2263 name="test",
2264 description="test",
2265 workspace=workspace,
2266 severity="critical",
2267 creator=creator,
2268 service=None,
2269 host=host,
22672270 )
22682271 vuln2 = VulnerabilityFactory.create(
2269 name="test 2",
2270 description="test",
2271 workspace=workspace,
2272 severity="critical",
2273 creator=creator,
2274 service=None,
2275 host=host,
2272 name="test 2",
2273 description="test",
2274 workspace=workspace,
2275 severity="critical",
2276 creator=creator,
2277 service=None,
2278 host=host,
22762279 )
22772280 vuln3 = VulnerabilityFactory.create(
2278 name="test 3",
2279 description="test",
2280 workspace=workspace,
2281 severity="low",
2282 creator=creator,
2283 service=None,
2284 host=host,
2281 name="test 3",
2282 description="test",
2283 workspace=workspace,
2284 severity="low",
2285 creator=creator,
2286 service=None,
2287 host=host,
22852288 )
22862289 vulns = VulnerabilityFactory.create_batch(
2287 10,
2288 workspace=workspace,
2289 service=None,
2290 severity="medium",
2291 host=host2,
2290 10,
2291 workspace=workspace,
2292 service=None,
2293 severity="medium",
2294 host=host2,
22922295 )
22932296 session.add(vuln)
22942297 session.add(vuln2)
22952298 session.add(vuln3)
22962299 session.add_all(vulns)
22972300 session.commit()
2298 query = {"order_by":[
2299 {"field":"host__vulnerability_critical_generic_count", "direction": "desc"},
2300 {"field":"host__vulnerability_high_generic_count", "direction": "desc"},
2301 {"field":"host__vulnerability_medium_generic_count", "direction": "desc"},
2301 query = {"order_by": [
2302 {"field": "host__vulnerability_critical_generic_count", "direction": "desc"},
2303 {"field": "host__vulnerability_high_generic_count", "direction": "desc"},
2304 {"field": "host__vulnerability_medium_generic_count", "direction": "desc"},
23022305 ],
2303 "filters": [{"or": [
2304 {"name": "severity", "op": "==", "val": "critical"},
2305 {"name": "severity", "op": "==", "val": "high"},
2306 {"name": "severity", "op": "==", "val": "medium"},
2307 ]}]
2306 "filters": [{"or": [
2307 {"name": "severity", "op": "==", "val": "critical"},
2308 {"name": "severity", "op": "==", "val": "high"},
2309 {"name": "severity", "op": "==", "val": "medium"},
2310 ]}]
23082311 }
23092312
23102313 data = {
2311 'q': json.dumps(query)
2314 'q': json.dumps(query)
23122315 }
23132316 res = test_client.get(self.check_url(f'/v2/ws/{workspace.name}/vulns/filter'), query_string=data)
23142317 assert res.status_code == 200
23222325 session.add(vuln)
23232326 session.commit()
23242327 data = {
2325 'q': json.dumps({"filters":[{"name":"creator","op":"eq","val": vuln.creator.username}]})
2328 'q': json.dumps({"filters": [{"name": "creator", "op": "eq", "val": vuln.creator.username}]})
23262329 }
23272330 res = test_client.get(self.check_url(f'/v2/ws/{workspace.name}/vulns/filter'), query_string=data)
23282331 assert res.status_code == 200
24222425 session.add(confirmed_vulns)
24232426 session.commit()
24242427 res = test_client.get(
2425 self.check_url(urljoin(self.url(workspace=workspace), 'export_csv/')) +
2426 '?q={"filters":[{"name":"confirmed","op":"==","val":"true"}]}'
2428 self.check_url(urljoin(self.url(workspace=workspace), 'export_csv/'))
2429 + '?q={"filters":[{"name":"confirmed","op":"==","val":"true"}]}'
24272430 )
24282431 assert res.status_code == 200
24292432 assert self._verify_csv(res.data, confirmed=True)
24492452 session.add(confirmed_vulns)
24502453 session.commit()
24512454 res = test_client.get(
2452 self.check_url(urljoin(self.url(workspace=workspace), 'export_csv/')) +
2453 '?q={"filters":[{"name":"severity","op":"==","val":"critical"}]}'
2455 self.check_url(urljoin(self.url(workspace=workspace), 'export_csv/'))
2456 + '?q={"filters":[{"name":"severity","op":"==","val":"critical"}]}'
24542457 )
24552458 assert res.status_code == 200
24562459 assert self._verify_csv(res.data, confirmed=True, severity='critical')
24612464 session.add(self.first_object)
24622465 session.commit()
24632466 res = test_client.get(
2464 self.check_url(urljoin(self.url(), 'export_csv/')) +
2465 '?confirmed=true'
2467 self.check_url(urljoin(self.url(), 'export_csv/'))
2468 + '?confirmed=true'
24662469 )
24672470 assert res.status_code == 200
24682471 self._verify_csv(res.data, confirmed=True)
25232526 self.first_object.custom_fields = {"cvss": "9", "invalid": "not shown"}
25242527 # another case with custom fields as None
25252528 vuln = VulnerabilityFactory.create()
2526 vuln.custom_fields=None
2529 vuln.custom_fields = None
25272530 session.add(vuln)
25282531 session.commit()
25292532
25332536 def _verify_csv(self, raw_csv_data, confirmed=False, severity=None):
25342537 custom_fields = [custom_field.field_name for custom_field in CustomFieldsSchema.query.all()]
25352538 vuln_headers = [
2536 "confirmed", "id", "date", "name", "severity", "service",
2539 "confirmed", "id", "date", "name", "severity", "service",
25372540 "target", "desc", "status", "hostnames", "comments", "owner",
25382541 "os", "resolution", "refs", "easeofresolution", "web_vulnerability",
25392542 "data", "website", "path", "status_code", "request", "response", "method",
25602563 return False
25612564 # test custom fields
25622565 for c_index, custom_field in enumerate(custom_fields):
2563 if vuln.custom_fields[custom_field] != line['cf_'+custom_field]:
2566 if vuln.custom_fields[custom_field] != line['cf_' + custom_field]:
25642567 return False
25652568
2566 #test hosts
2569 # test hosts
25672570 host = Host.query.filter(Host.id == line['host_id']).first()
25682571 if host:
25692572 if host.ip != line['target']:
26042607 view_class = VulnerabilityV3View
26052608
26062609 def url(self, obj=None, workspace=None):
2607 return v2_to_v3(super(TestListVulnerabilityViewV3, self).url(obj, workspace))
2610 return v2_to_v3(super().url(obj, workspace))
26082611
26092612 def check_url(self, url):
26102613 return v2_to_v3(url)
26542657 session.add(custom_field_schema)
26552658 session.commit()
26562659 data = {
2657 'name': 'Test Alert policy_violations',
2658 'severity': 'informational',
2659 'creator': 'Zap',
2660 'parent_type': 'Host',
2661 'parent': host.id,
2662 'type': 'Vulnerability',
2663 'custom_fields': {
2660 'name': 'Test Alert policy_violations',
2661 'severity': 'informational',
2662 'creator': 'Zap',
2663 'parent_type': 'Host',
2664 'parent': host.id,
2665 'type': 'Vulnerability',
2666 'custom_fields': {
26642667 'cvss': '321321',
2665 }
2668 }
26662669 }
26672670 res = test_client.post(self.url(), data=data)
26682671
26692672 assert res.status_code == 201
26702673 assert res.json['custom_fields']['cvss'] == '321321'
26712674
2672 def test_create_vuln_with_custom_fields_using_field_display_name_continues_with_warning(self, test_client, second_workspace, session, caplog):
2675 def test_create_vuln_with_custom_fields_using_field_display_name_continues_with_warning(self, test_client,
2676 second_workspace, session,
2677 caplog):
26732678 host = HostFactory.create(workspace=self.workspace)
26742679 custom_field_schema = CustomFieldsSchemaFactory(
26752680 field_name='cvss',
26812686 session.add(custom_field_schema)
26822687 session.commit()
26832688 data = {
2684 'name': 'Test Alert policy_violations',
2685 'severity': 'informational',
2686 'creator': 'Zap',
2687 'parent_type': 'Host',
2688 'parent': host.id,
2689 'type': 'Vulnerability',
2690 'custom_fields': {
2689 'name': 'Test Alert policy_violations',
2690 'severity': 'informational',
2691 'creator': 'Zap',
2692 'parent_type': 'Host',
2693 'parent': host.id,
2694 'type': 'Vulnerability',
2695 'custom_fields': {
26912696 'CVSS': '321321', # here we use the display_name and not the field_name
2692 }
2697 }
26932698 }
26942699 res = test_client.post(self.url(), data=data)
26952700
27082713 session.add(custom_field_schema)
27092714 session.commit()
27102715 data = {
2711 'name': 'Test Alert policy_violations',
2712 'severity': 'informational',
2713 'creator': 'Zap',
2714 'parent_type': 'Host',
2715 'parent': host.id,
2716 'type': 'Vulnerability',
2717 'custom_fields': {
2716 'name': 'Test Alert policy_violations',
2717 'severity': 'informational',
2718 'creator': 'Zap',
2719 'parent_type': 'Host',
2720 'parent': host.id,
2721 'type': 'Vulnerability',
2722 'custom_fields': {
27182723 'changes': ['1', '2', '3'],
2719 }
2724 }
27202725 }
27212726 res = test_client.post(self.url(), data=data)
27222727
27352740 session.add(custom_field_schema)
27362741 session.commit()
27372742 data = {
2738 'name': 'Test Alert policy_violations',
2739 'severity': 'informational',
2740 'creator': 'Zap',
2741 'parent_type': 'Host',
2742 'parent': host.id,
2743 'type': 'Vulnerability',
2744 'custom_fields': {
2743 'name': 'Test Alert policy_violations',
2744 'severity': 'informational',
2745 'creator': 'Zap',
2746 'parent_type': 'Host',
2747 'parent': host.id,
2748 'type': 'Vulnerability',
2749 'custom_fields': {
27452750 'cvss': 'pepe',
2746 }
2751 }
27472752 }
27482753 res = test_client.post(self.url(), data=data)
27492754
27502755 assert res.status_code == 400
27512756
2752 def test_create_vuln_with_invalid_custom_fields_continues_with_warning(self, test_client, second_workspace, session, caplog):
2757 def test_create_vuln_with_invalid_custom_fields_continues_with_warning(self, test_client, second_workspace, session,
2758 caplog):
27532759 host = HostFactory.create(workspace=self.workspace)
27542760 session.add(host)
27552761 session.commit()
27562762 data = {
2757 'name': 'Test Alert policy_violations',
2758 'severity': 'informational',
2759 'creator': 'Zap',
2760 'parent_type': 'Host',
2761 'parent': host.id,
2762 'type': 'Vulnerability',
2763 'custom_fields': {
2763 'name': 'Test Alert policy_violations',
2764 'severity': 'informational',
2765 'creator': 'Zap',
2766 'parent_type': 'Host',
2767 'parent': host.id,
2768 'type': 'Vulnerability',
2769 'custom_fields': {
27642770 'CVSS': '321321',
2765 }
2771 }
27662772 }
27672773 res = test_client.post(self.url(), data=data)
27682774
29953001 assert cmd_obj.object_id == res.json['_id']
29963002 assert res.json['tool'] == command.tool
29973003
3004 @pytest.mark.parametrize('refs', [
3005 ('cve', 'CVE-2017-0002'),
3006 ('owasp', 'https://www.owasp.org/index.php/XSS_%28Cross_Site_Scripting%29_Prevention_Cheat_Sheet'),
3007 ('cwe', 'CWE-135'),
3008 ('cvss', 'CVSS v2 Vector(AV:A/AC:M/Au:S/C:P/I:P/A:N)'),
3009 ])
3010 def test_vuln_with_specific_refs(self, host_with_hostnames, test_client, session, refs):
3011 ref_name, ref_example = refs
3012 raw_data_vuln = _create_post_data_vulnerability(
3013 name='New vuln 1',
3014 vuln_type='Vulnerability',
3015 parent_id=host_with_hostnames.id,
3016 parent_type='Host',
3017 refs=[ref_example],
3018 policyviolations=[],
3019 description='helloworld 1',
3020 severity='low',
3021 )
3022
3023 post_response = test_client.post(self.url(workspace=host_with_hostnames.workspace), data=raw_data_vuln)
3024 vuln_1_id = post_response.json['obj_id']
3025 get_response = test_client.get(self.url(workspace=host_with_hostnames.workspace, obj=vuln_1_id))
3026
3027 assert get_response.status_code == 200
3028 assert ref_name in get_response.json
3029 assert 1 == len(get_response.json[ref_name])
3030 assert ref_example == get_response.json[ref_name][0]
3031
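The new `test_vuln_with_specific_refs` cases assert that a reference string recognizable as a CVE, OWASP, CWE, or CVSS entry comes back under a dedicated key of the vulnerability response. The server-side logic is not shown in this diff; a rough, hypothetical classifier mirroring what the parametrized cases expect might look like this:

```python
import re

def classify_ref(ref: str) -> str:
    # Hypothetical mapping; Faraday's actual implementation may differ.
    if re.match(r'CVE-\d{4}-\d+', ref):
        return 'cve'
    if re.match(r'CWE-\d+', ref):
        return 'cwe'
    if 'owasp.org' in ref:
        return 'owasp'
    if ref.upper().startswith('CVSS'):
        return 'cvss'
    return 'refs'  # anything else stays a plain reference
```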
29983032
29993033 class TestCustomFieldVulnerabilityV3(TestCustomFieldVulnerability, PatchableTestsMixin):
30003034 view_class = VulnerabilityV3View
30013035
30023036 def url(self, obj=None, workspace=None):
3003 return v2_to_v3(super(TestCustomFieldVulnerabilityV3, self).url(obj, workspace))
3037 return v2_to_v3(super().url(obj, workspace))
30043038
30053039 def check_url(self, url):
30063040 return v2_to_v3(url)
30123046 @pytest.mark.skip(reason="To be reimplemented")
30133047 def test_bulk_delete_vuln_severity(self, host_with_hostnames, test_client, session):
30143048 pass
3015
30163049
30173050
30183051 @pytest.mark.usefixtures('logged_user')
30383071 view_class = VulnerabilityV3View
30393072
30403073 def url(self, obj=None, workspace=None):
3041 return v2_to_v3(super(TestVulnerabilityCustomFieldsV3, self).url(obj, workspace))
3074 return v2_to_v3(super().url(obj, workspace))
30423075
30433076
30443077 @pytest.mark.usefixtures('logged_user')
30583091 session.commit()
30593092
30603093 query_filter = {"filters":
3061 [{"name":"hostnames","op":"eq","val":"pepe"}]
3062 }
3094 [{"name": "hostnames", "op": "eq", "val": "pepe"}]
3095 }
30633096 res = test_client.get(
30643097 self.check_url(f'/v2/ws/{workspace.name}/vulns/filter?q={json.dumps(query_filter)}')
30653098 )
30793112 session.commit()
30803113
30813114 query_filter = {"filters":
3082 [{"name":"hostnames","op":"eq","val":"pepe"}]
3083 }
3115 [{"name": "hostnames", "op": "eq", "val": "pepe"}]
3116 }
30843117 res = test_client.get(
30853118 self.check_url(f'/v2/ws/{workspace.name}/vulns/') + f'?q={json.dumps(query_filter)}'
30863119 )
31223155
31233156 def test_search_code_attribute_bug(self, workspace, test_client, session):
31243157 query_filter = {"filters":
3125 [{"name":"code", "op": "eq", "val": "test"}]
3158 [{"name": "code", "op": "eq", "val": "test"}]
31263159 }
31273160 res = test_client.get(
31283161 self.check_url(f'/v2/ws/{workspace.name}/vulns/filter?q={json.dumps(query_filter)}')
31403173 session.add(host)
31413174 session.commit()
31423175
3143 query_filter = {"filters":[
3144 {"and": [{"name": "hostnames","op": "eq", "val": "pepe"}]}
3176 query_filter = {"filters": [
3177 {"and": [{"name": "hostnames", "op": "eq", "val": "pepe"}]}
31453178 ]}
31463179 res = test_client.get(
31473180 self.check_url(f'/v2/ws/{workspace.name}/vulns/filter?q={json.dumps(query_filter)}')
31563189 workspace = WorkspaceFactory.create()
31573190 host = HostFactory.create(workspace=workspace)
31583191 vulns = VulnerabilityFactory.create_batch(10,
3159 workspace=workspace,
3160 severity='high'
3161 )
3192 workspace=workspace,
3193 severity='high'
3194 )
31623195 session.add_all(vulns)
31633196 web_vulns = VulnerabilityWebFactory.create_batch(10,
3164 workspace=workspace,
3165 severity='high'
3166 )
3197 workspace=workspace,
3198 severity='high'
3199 )
31673200 session.add_all(web_vulns)
31683201 session.add(host)
31693202 session.commit()
31713204 expected_vulns = set([vuln.id for vuln in vulns] + [vuln.id for vuln in web_vulns])
31723205 for offset in range(0, 2):
31733206 query_filter = {
3174 "filters":[{"name":"severity","op":"eq","val":"high"}],
3207 "filters": [{"name": "severity", "op": "eq", "val": "high"}],
31753208 "limit": 10,
31763209 "offset": offset * 10,
31773210 }
31913224 workspace = WorkspaceFactory.create()
31923225 host = HostFactory.create(workspace=workspace)
31933226 vulns = VulnerabilityWebFactory.create_batch(100,
3194 workspace=workspace,
3195 severity='high'
3196 )
3227 workspace=workspace,
3228 severity='high'
3229 )
31973230 session.add_all(vulns)
31983231 session.add(host)
31993232 session.commit()
32013234 expected_vulns = set([vuln.id for vuln in vulns])
32023235 for offset in range(0, 10):
32033236 query_filter = {
3204 "filters":[{"name":"severity","op":"eq","val":"high"}],
3237 "filters": [{"name": "severity", "op": "eq", "val": "high"}],
32053238 "limit": 10,
32063239 "offset": 10 * offset,
32073240 }
32203253 workspace = WorkspaceFactory.create()
32213254 host = HostFactory.create(workspace=workspace)
32223255 vulns = VulnerabilityWebFactory.create_batch(10,
3223 workspace=workspace,
3224 severity='high'
3225 )
3256 workspace=workspace,
3257 severity='high'
3258 )
32263259 session.add_all(vulns)
32273260 vulns = VulnerabilityFactory.create_batch(10,
3228 workspace=workspace,
3229 severity='low'
3230 )
3261 workspace=workspace,
3262 severity='low'
3263 )
32313264 session.add_all(vulns)
32323265 med_vulns = VulnerabilityFactory.create_batch(10,
3233 workspace=workspace,
3234 severity='medium'
3235 )
3266 workspace=workspace,
3267 severity='medium'
3268 )
32363269 session.add_all(med_vulns)
32373270 session.add(host)
32383271 session.commit()
32403273 expected_vulns = set([vuln.id for vuln in med_vulns])
32413274 for offset in range(0, 10):
32423275 query_filter = {
3243 "filters":[{"name":"severity","op":"eq","val":"medium"}],
3244 "limit":"1",
3276 "filters": [{"name": "severity", "op": "eq", "val": "medium"}],
3277 "limit": "1",
32453278 "offset": offset,
32463279 }
32473280 res = test_client.get(
33533386
33543387 @pytest.mark.skip_sql_dialect('sqlite')
33553388 def test_search_hypothesis_test_found_case(self, test_client, session, workspace):
3356 query_filter = {'filters': [{'name': 'host_id', 'op': 'not_in', 'val': '\U0010a1a7\U00093553\U000eb46a\x1e\x10\r\x18%\U0005ddfa0\x05\U000fdeba\x08\x04絮'}]}
3389 query_filter = {'filters': [{'name': 'host_id', 'op': 'not_in',
3390 'val': '\U0010a1a7\U00093553\U000eb46a\x1e\x10\r\x18%\U0005ddfa0\x05\U000fdeba\x08\x04絮'}]}
33573391 res = test_client.get(
33583392 self.check_url(f'/v2/ws/{workspace.name}/vulns/filter?q={json.dumps(query_filter)}')
33593393 )
34053439
34063440 @pytest.mark.skip_sql_dialect('sqlite')
34073441 def test_search_hypothesis_test_found_case_6(self, test_client, session, workspace):
3408 query_filter = {'filters': [{'name': 'resolution', 'op': 'any', 'val': ''}]}
3442 query_filter = {'filters': [{'name': 'resolution', 'op': '==', 'val': ''}]}
34093443 res = test_client.get(
34103444 self.check_url(f'/v2/ws/{workspace.name}/vulns/filter?q={json.dumps(query_filter)}')
34113445 )
34133447
34143448 @pytest.mark.skip_sql_dialect('sqlite')
34153449 def test_search_hypothesis_test_found_case_7(self, test_client, session, workspace):
3416 query_filter = {'filters': [{'name': 'name', 'op': '>', 'val': '\U0004e755\U0007a789\U000e02d1\U000b3d32\x10\U000ad0e2,\x05\x1a'}, {'name': 'creator', 'op': 'eq', 'val': 21883}]}
3450 query_filter = {'filters': [
3451 {'name': 'name', 'op': '>', 'val': '\U0004e755\U0007a789\U000e02d1\U000b3d32\x10\U000ad0e2,\x05\x1a'},
3452 {'name': 'creator', 'op': 'eq', 'val': 21883}]}
34173453 res = test_client.get(
34183454 self.check_url(f'/v2/ws/{workspace.name}/vulns/filter?q={json.dumps(query_filter)}')
34193455 )
34403476
34413477 @pytest.mark.skip_sql_dialect('sqlite')
34423478 def test_search_hypothesis_test_found_case_9(self, test_client, session, workspace):
3443 query_filter = {'filters': [{'name': 'issuetracker', 'op': 'not_equal_to', 'val': '0\x00\U00034383$\x13-\U000375fb\U0007add2\x01\x01\U0010c23a'}]}
3479 query_filter = {'filters': [{'name': 'issuetracker', 'op': 'not_equal_to',
3480 'val': '0\x00\U00034383$\x13-\U000375fb\U0007add2\x01\x01\U0010c23a'}]}
34443481
34453482 res = test_client.get(
34463483 self.check_url(f'/v2/ws/{workspace.name}/vulns/filter?q={json.dumps(query_filter)}')
35133550 session.commit()
35143551 query_filter = {
35153552 "group_by":
3516 [{"field":"severity"}],
3553 [{"field": "severity"}],
35173554 "order_by":
3518 [{"field":"name","direction":"asc"}]
3555 [{"field": "name", "direction": "asc"}]
35193556 }
35203557
35213558 res = test_client.get(
35253562
35263563 @pytest.mark.skip_sql_dialect('sqlite')
35273564 @pytest.mark.parametrize("sort_order", [
3528 {"direction":"asc", "expected": ['a', 'A', 'b', 'B']},
3529 {"direction":"desc", "expected": ['B', 'b', 'A', 'a']}
3565 {"direction": "asc", "expected": ['a', 'A', 'b', 'B']},
3566 {"direction": "desc", "expected": ['B', 'b', 'A', 'a']}
35303567 ])
35313568 def test_filter_order_by_name_directions(self, sort_order, test_client, session, workspace):
35323569 vuln_1 = VulnerabilityWebFactory.create(name='a', workspace=workspace, severity='high')
35383575 session.commit()
35393576 query_filter = {
35403577 "order_by":
3541 [{"field":"name","direction": sort_order["direction"]}],
3578 [{"field": "name", "direction": sort_order["direction"]}],
35423579 "limit": 10,
35433580 "offset": 0
35443581 }
35623599 session.commit()
35633600 query_filter = {
35643601 "order_by":
3565 [{"field":"severity","direction":"asc"}],
3602 [{"field": "severity", "direction": "asc"}],
35663603 "limit": 10,
35673604 "offset": 0
35683605 }
35763613 assert expected_order == [vuln['value']['severity'] for vuln in res.json['vulnerabilities']]
35773614
35783615 def test_filter_by_creator_command_id(self,
3579 test_client,
3580 session,
3581 workspace,
3582 command_object_factory,
3583 empty_command_factory):
3616 test_client,
3617 session,
3618 workspace,
3619 command_object_factory,
3620 empty_command_factory):
35843621
35853622 command = empty_command_factory.create(workspace=workspace,
35863623 tool="metasploit")
36003637 workspace=workspace)
36013638 session.commit()
36023639
3603 query_filter ={
3604 "filters":[{"and":[
3605 {"name":"creator_command_id","op":"==","val":command.id}]
3640 query_filter = {
3641 "filters": [{"and": [
3642 {"name": "creator_command_id", "op": "==", "val": command.id}]
36063643 }],
3607 "offset":0,
3608 "limit":40
3644 "offset": 0,
3645 "limit": 40
36093646 }
36103647
36113648 res = test_client.get(
36183655 class TestVulnerabilitySearchV3(TestVulnerabilitySearch):
36193656 def check_url(self, url):
36203657 return v2_to_v3(url)
3658
36213659
36223660 def test_type_filter(workspace, session,
36233661 vulnerability_factory,
37353773 'parent_type': st.sampled_from([parent_type]),
37363774 'type': st.one_of(
37373775 st.sampled_from([
3738 "Vulnerability", "Invalid", None]),
3776 "Vulnerability", "Invalid", None]),
37393777 st.text()
37403778 ),
37413779 'ws': st.one_of(st.none(), st.text()),
37433781 'data': st.one_of(st.none(), st.text()),
37443782 'desc': st.one_of(st.none(), st.text()),
37453783 'easeofresolution': st.sampled_from(['trivial',
3746 'simple',
3747 'moderate',
3748 'difficult',
3749 'infeasible']),
3784 'simple',
3785 'moderate',
3786 'difficult',
3787 'infeasible']),
37503788 'impact': st.fixed_dictionaries({'accountability': st.booleans(), 'availability': st.booleans(),
3751 'confidentiality': st.booleans(),
3752 'integrity': st.booleans()}),
3789 'confidentiality': st.booleans(),
3790 'integrity': st.booleans()}),
37533791 'name': st.one_of(st.none(), st.text()),
37543792 'owned': st.booleans(),
37553793 'policyviolations': st.lists(st.one_of(st.none(), st.text())),
37563794 'refs': st.lists(st.one_of(st.none(), st.text())),
37573795 'resolution': st.one_of(st.none(), st.text()),
37583796 'severity': st.sampled_from(['critical',
3759 'high',
3760 'med',
3761 'medium',
3762 'low',
3763 'informational',
3764 'unclassified']),
3797 'high',
3798 'med',
3799 'medium',
3800 'low',
3801 'informational',
3802 'unclassified']),
37653803 'status': st.sampled_from(['open',
3766 'closed',
3767 're-opened',
3768 'risk-accepted']),
3804 'closed',
3805 're-opened',
3806 'risk-accepted']),
37693807 '_attachments': st.fixed_dictionaries({}),
37703808 'description': st.one_of(st.none(), st.text()),
37713809 'protocol': st.one_of(st.none(), st.text()),
37743812 vuln_dict.update({
37753813 '_id': st.integers(min_value=vuln.id, max_value=vuln.id),
37763814 'id': st.integers(min_value=vuln.id, max_value=vuln.id)
3777 })
3815 })
37783816 return st.fixed_dictionaries(vuln_dict)
37793817
37803818
37893827
37903828 @given(VulnerabilityData)
37913829 def send_api_create_request(raw_data):
3792
37933830 ws_name = host_with_hostnames.workspace.name
37943831 res = test_client.post(f'/v2/ws/{ws_name}/vulns/',
37953832 data=raw_data)
37973834
37983835 @given(VulnerabilityData)
37993836 def send_api_create_request_v3(raw_data):
3800
38013837 ws_name = host_with_hostnames.workspace.name
38023838 res = test_client.post(f'/v3/ws/{ws_name}/vulns/',
38033839 data=raw_data)
38053841
38063842 @given(VulnerabilityDataWithId)
38073843 def send_api_update_request(raw_data):
3808
38093844 ws_name = host_with_hostnames.workspace.name
38103845 res = test_client.put(f"/v2/ws/{ws_name}/vulns/{raw_data['_id']}",
3811 data=raw_data)
3846 data=raw_data)
38123847 assert res.status_code in [200, 400, 409, 405]
38133848
38143849 @given(VulnerabilityDataWithId)
38153850 def send_api_update_request_v3(raw_data):
3816
38173851 ws_name = host_with_hostnames.workspace.name
38183852 res = test_client.put(f"/v3/ws/{ws_name}/vulns/{raw_data['_id']}",
3819 data=raw_data)
3853 data=raw_data)
38203854 assert res.status_code in [200, 400, 409, 405]
38213855
38223856 send_api_create_request()
38473881 )
38483882 })
38493883
3884
38503885 @pytest.mark.usefixtures('logged_user')
38513886 @pytest.mark.hypothesis
38523887 @pytest.mark.usefixtures('ignore_nplusone')
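The hunks above are pure re-indentation, but the pattern they touch is worth spelling out: a `st.fixed_dictionaries` strategy generates vulnerability payloads, and `@given`-decorated inner functions push each payload through the API, asserting only on the status code. A minimal self-contained sketch of that pattern (the endpoint, field set, injected `post_json` callable and accepted status codes are illustrative assumptions, not Faraday's actual API):

```python
# Sketch of the property-based API test pattern above (names here are illustrative).
from hypothesis import given, settings, strategies as st

VulnData = st.fixed_dictionaries({
    'name': st.one_of(st.none(), st.text()),
    'severity': st.sampled_from(['critical', 'high', 'medium', 'low',
                                 'informational', 'unclassified']),
    'impact': st.fixed_dictionaries({'accountability': st.booleans(),
                                     'availability': st.booleans()}),
    'refs': st.lists(st.one_of(st.none(), st.text())),
})

def check_create_endpoint(post_json):
    """`post_json(path, payload) -> status_code` is injected, keeping the sketch client-agnostic."""

    @given(VulnData)
    @settings(max_examples=25, deadline=None)
    def run(raw_data):
        status = post_json('/v3/ws/demo/vulns/', raw_data)
        # Random payloads may be accepted or rejected, but must never crash the API.
        assert status in (201, 400, 409)

    run()
```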
0 #-*- coding: utf8 -*-
0 # -*- coding: utf8 -*-
11 '''
22 Faraday Penetration Test IDE
33 Copyright (C) 2013 Infobyte LLC (http://www.infobytesec.com/)
2626 from tests.utils.url import v2_to_v3
2727
2828 TEMPLATES_DATA = [
29 {'name': 'XML Injection (aka Blind XPath Injection) (Type: Base)',
30 'description': 'The software does not properly neutralize special elements that are u',
31 'resolution': 'resolved',
32 'severity': 'medium',
33 'create_date': datetime(2020, 5, 1, 11, 00),
34 'creator': 'testuser'
35 },
36 {'name': 'xml InjectioN (aka Blind XPath Injection) (Type: Base)',
37 'description': 'THE SOFtware does not properly neutralize special elements that are',
38 'resolution': 'not resolved',
39 'severity': 'high',
40 'create_date': datetime(2020, 6, 1),
41 'creator': 'testuser2'
42 }
29 {'name': 'XML Injection (aka Blind XPath Injection) (Type: Base)',
30 'description': 'The software does not properly neutralize special elements that are u',
31 'resolution': 'resolved',
32 'severity': 'medium',
33 'create_date': datetime(2020, 5, 1, 11, 00),
34 'creator': 'testuser'
35 },
36 {'name': 'xml InjectioN (aka Blind XPath Injection) (Type: Base)',
37 'description': 'THE SOFtware does not properly neutralize special elements that are',
38 'resolution': 'not resolved',
39 'severity': 'high',
40 'create_date': datetime(2020, 6, 1),
41 'creator': 'testuser2'
42 }
4343 ]
4444
4545
8181
8282 def _create_post_data_vulnerability_template(self, references):
8383 data = {
84 "exploitation":"high",
85 "references":references,
86 "name":"name",
84 "exploitation": "high",
85 "references": references,
86 "name": "name",
8787 "resolution": "resolution",
88 "cwe":"swe",
89 "description":"desc"}
88 "cwe": "swe",
89 "description": "desc"}
9090 return data
9191
9292 def test_create_new_vulnerability_template(self, session, test_client):
107107 'templates': TEMPLATES_DATA},
108108 {'field': 'name', 'op': 'eq', 'count': 1,
109109 'filtered_value': 'XML Injection (aka Blind XPath Injection) (Type: Base)',
110 'expected_template_name':TEMPLATES_DATA[0]['name'],
110 'expected_template_name': TEMPLATES_DATA[0]['name'],
111111 'templates': TEMPLATES_DATA},
112112 {'field': 'name', 'op': 'like', 'count': 1,
113113 'filtered_value': '% Injection (aka Blind XPath Injection)%',
165165 ))
166166 session.commit()
167167
168 query = self.check_url(f'/v2/vulnerability_template/filter?q={{"filters": [' \
169 f'{{ "name": "{filters["field"]}",' \
170 f' "op": "{filters["op"]}", ' \
171 f' "val": "{filters["filtered_value"]}" }}]}}')
168 query = self.check_url(f'/v2/vulnerability_template/filter?q={{"filters": ['
169 f'{{ "name": "{filters["field"]}",'
170 f' "op": "{filters["op"]}", '
171 f' "val": "{filters["filtered_value"]}" }}]}}')
172172
173173 res = test_client.get(query)
174174 assert res.status_code == 200
178178
179179 @pytest.mark.usefixtures('ignore_nplusone')
180180 @pytest.mark.parametrize('filters', [
181 {'field': 'creator_id', 'op': 'eq', 'count': 1,
182 'filtered_value': TEMPLATES_DATA[0]['creator'],
183 'expected_template_name': TEMPLATES_DATA[0]['name'],
184 'templates': TEMPLATES_DATA}
181 {'field': 'creator_id', 'op': 'eq', 'count': 1,
182 'filtered_value': TEMPLATES_DATA[0]['creator'],
183 'expected_template_name': TEMPLATES_DATA[0]['name'],
184 'templates': TEMPLATES_DATA}
185185 ])
186186 # TODO: fix filter restless to filter by username
187187 def test_filter_vuln_template_by_creator(self, session, test_client, filters):
203203 ))
204204 session.commit()
205205
206 query = self.check_url(f'/v2/vulnerability_template/filter?q={{"filters": [' \
207 f'{{ "name": "{filters["field"]}",' \
208 f' "op": "{filters["op"]}", ' \
209 f' "val": "{templates[0].creator.id}" }}]}}')
206 query = self.check_url(f'/v2/vulnerability_template/filter?q={{"filters": ['
207 f'{{ "name": "{filters["field"]}",'
208 f' "op": "{filters["op"]}", '
209 f' "val": "{templates[0].creator.id}" }}]}}')
210210
211211 res = test_client.get(query)
212212 assert res.status_code == 200
214214 if filters['count'] == 1:
215215 assert res.json['rows'][0]['doc']['name'] == templates[0].name
216216
217
218217 @pytest.mark.skip_sql_dialect('sqlite')
219218 @pytest.mark.usefixtures('ignore_nplusone')
220219 @pytest.mark.parametrize('filters', [
221 {'field': 'create_date', 'op': 'eq', 'count': 1,
222 'filtered_value': "2020-05-01",
223 'expected_template_name': TEMPLATES_DATA[0]['name'],
224 'templates': TEMPLATES_DATA}
220 {'field': 'create_date', 'op': 'eq', 'count': 1,
221 'filtered_value': "2020-05-01",
222 'expected_template_name': TEMPLATES_DATA[0]['name'],
223 'templates': TEMPLATES_DATA}
225224 ])
226225 def test_filter_vuln_template_by_create_date(self, session, test_client, filters):
227226 templates = []
242241 ))
243242 session.commit()
244243
245 query = self.check_url(f'/v2/vulnerability_template/filter?q={{"filters": [' \
246 f'{{ "name": "{filters["field"]}",' \
247 f' "op": "{filters["op"]}", ' \
248 f' "val": "{filters["filtered_value"]}" }}]}}')
244 query = self.check_url(f'/v2/vulnerability_template/filter?q={{"filters": ['
245 f'{{ "name": "{filters["field"]}",'
246 f' "op": "{filters["op"]}", '
247 f' "val": "{filters["filtered_value"]}" }}]}}')
249248
250249 res = test_client.get(query)
251250 assert res.status_code == 200
325324 """
326325
327326 raw_data = {
328 "id":123010,
327 "id": 123010,
329328 "cwe": "",
330329 "description": "test2",
331 "desc":"test2",
332 "exploitation":"critical",
333 "name":"test2",
334 "references":[],
335 "refs":[],
336 "resolution":"",
337 "type":"vulnerability_template"
330 "desc": "test2",
331 "exploitation": "critical",
332 "name": "test2",
333 "references": [],
334 "refs": [],
335 "resolution": "",
336 "type": "vulnerability_template"
338337 }
339338 res = test_client.post(self.url(), data=raw_data)
340339 assert res.status_code == 201
365364 session.commit()
366365
367366 raw_data = {
368 "id":123010,
367 "id": 123010,
369368 "cwe": "",
370369 "description": "test2",
371 "desc":"test2",
372 "exploitation":"critical",
373 "name":"test2",
374 "references":[],
375 "refs":[],
376 "resolution":"",
370 "desc": "test2",
371 "exploitation": "critical",
372 "name": "test2",
373 "references": [],
374 "refs": [],
375 "resolution": "",
377376 "type": "vulnerability_template",
378377 "customfields": {
379378 "cvss": "value",
400399 raw_data = {
401400 "cwe": "",
402401 "description": "test2",
403 "desc":"test2",
404 "exploitation":"critical",
405 "name":"test2",
406 "references":[],
407 "refs":[],
408 "resolution":"",
402 "desc": "test2",
403 "exploitation": "critical",
404 "name": "test2",
405 "references": [],
406 "refs": [],
407 "resolution": "",
409408 "type": "vulnerability_template",
410409 "customfields": {
411410 "cvss": "updated value",
482481 assert len(res.json['vulns_created']) == expected_created_vuln_template
483482 assert res.json['vulns_created'][0][1] == vuln_template_name
484483
485
486484 def test_add_vuln_template_missing_required_fields(self, session, test_client, csrf_token):
487485 expected_created_vuln_template = 1
488486 file_contents = b"""name,description\n
625623
626624 class TestListVulnerabilityTemplateViewV3(TestListVulnerabilityTemplateView, PatchableTestsMixin):
627625 def url(self, obj=None):
628 return v2_to_v3(super(TestListVulnerabilityTemplateViewV3, self).url(obj))
626 return v2_to_v3(super().url(obj))
629627
630628 def check_url(self, url):
631629 return v2_to_v3(url)
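The query-building changes above drop the trailing backslashes because adjacent string literals inside a parenthesised call are joined implicitly, so the explicit line continuations were redundant. If the doubled `{{ }}` escapes were also considered noise, the same `?q=` parameter could be produced by serialising the filter dict; a sketch under that assumption (the helper name and the use of `json`/`urllib.parse` are ours, only the endpoint path comes from the tests above):

```python
import json
from urllib.parse import quote

def template_filter_url(field, op, value):
    """Build the ?q= filter URL used by the vulnerability_template filter tests."""
    q = {"filters": [{"name": field, "op": op, "val": value}]}
    # json.dumps takes care of braces and quoting, so no {{ }} escaping is needed.
    return f'/v2/vulnerability_template/filter?q={quote(json.dumps(q))}'

# e.g. template_filter_url("create_date", "eq", "2020-05-01")
```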
+0
-104
tests/test_api_websocket_auth.py less more
0 '''
1 Faraday Penetration Test IDE
2 Copyright (C) 2013 Infobyte LLC (http://www.infobytesec.com/)
3 See the file 'doc/LICENSE' for the license information
4
5 '''
6 from builtins import str
7
8 import pytest
9 from faraday.server.api.modules.websocket_auth import decode_agent_websocket_token
10 from tests.utils.url import v2_to_v3
11
12
13 class TestWebsocketAuthEndpoint:
14 def check_url(self, url):
15 return url
16
17 def test_not_logged_in_request_fail(self, test_client, workspace):
18 res = test_client.post(self.check_url(f'/v2/ws/{workspace.name}/websocket_token/'))
19 assert res.status_code == 401
20
21 @pytest.mark.usefixtures('logged_user')
22 def test_get_method_not_allowed(self, test_client, workspace):
23 res = test_client.get(self.check_url(f'/v2/ws/{workspace.name}/websocket_token/'))
24 assert res.status_code == 405
25
26 @pytest.mark.usefixtures('logged_user')
27 def test_succeeds(self, test_client, workspace):
28 res = test_client.post(self.check_url(f'/v2/ws/{workspace.name}/websocket_token/'))
29 assert res.status_code == 200
30
31 # A token for that workspace should be generated,
32 # This will break if we change the token generation
33 # mechanism.
34 assert res.json['token'].startswith(str(workspace.id))
35
36
37 class TestWebsocketAuthEndpointV3(TestWebsocketAuthEndpoint):
38 def check_url(self, url):
39 return v2_to_v3(url)
40
41
42 class TestAgentWebsocketToken:
43
44 def check_url(self, url):
45 return url
46
47 @pytest.mark.usefixtures('session') # I don't know why this is required
48 def test_fails_without_authorization_header(self, test_client):
49 res = test_client.post(
50 self.check_url('/v2/agent_websocket_token/')
51 )
52 assert res.status_code == 401
53
54 @pytest.mark.usefixtures('logged_user')
55 def test_fails_with_logged_user(self, test_client):
56 res = test_client.post(
57 self.check_url('/v2/agent_websocket_token/')
58 )
59 assert res.status_code == 401
60
61 @pytest.mark.usefixtures('logged_user')
62 def test_fails_with_user_token(self, test_client, session):
63 res = test_client.get(self.check_url('/v2/token/'))
64
65 assert res.status_code == 200
66
67 headers = [('Authorization', 'Token ' + res.json)]
68
69 # clear cookies to make sure test_client has no session
70 test_client.cookie_jar.clear()
71 res = test_client.post(
72 self.check_url('/v2/agent_websocket_token/'),
73 headers=headers,
74 )
75 assert res.status_code == 401
76
77 @pytest.mark.usefixtures('session')
78 def test_fails_with_invalid_agent_token(self, test_client):
79 headers = [('Authorization', 'Agent 13123')]
80 res = test_client.post(
81 self.check_url('/v2/agent_websocket_token/'),
82 headers=headers,
83 )
84 assert res.status_code == 403
85
86 @pytest.mark.usefixtures('session')
87 def test_succeeds_with_agent_token(self, test_client, agent, session):
88 session.add(agent)
89 session.commit()
90 assert agent.token
91 headers = [('Authorization', 'Agent ' + agent.token)]
92 res = test_client.post(
93 self.check_url('/v2/agent_websocket_token/'),
94 headers=headers,
95 )
96 assert res.status_code == 200
97 decoded_agent = decode_agent_websocket_token(res.json['token'])
98 assert decoded_agent == agent
99
100
101 class TestAgentWebsocketTokenV3(TestAgentWebsocketToken):
102 def check_url(self, url):
103 return v2_to_v3(url)
131131 vulns += vulnerability_web_factory.create_batch(2, workspace=self.first_object,
132132 confirmed=True, status='open', severity='informational')
133133
134
135
136134 session.add_all(vulns)
137135 session.commit()
138136 res = test_client.get(self.url(self.first_object) + querystring)
144142 assert res.json['stats']['info_vulns'] == 2
145143 assert res.json['stats']['total_vulns'] == 2
146144
147
148145 @pytest.mark.parametrize('querystring', [
149146 '?status=closed'
150147 ])
195192 vulns += vulnerability_web_factory.create_batch(2, workspace=self.first_object,
196193 confirmed=True, status='open')
197194
198
199
200195 session.add_all(vulns)
201196 session.commit()
202197 res = test_client.get(self.url(self.first_object) + querystring)
205200 assert res.json['stats']['web_vulns'] == 2
206201 assert res.json['stats']['std_vulns'] == 11
207202 assert res.json['stats']['total_vulns'] == 13
208
209203
210204 @pytest.mark.parametrize('querystring', [
211205 '?confirmed=1',
230224 '?confirmed=0',
231225 '?confirmed=false'
232226 ])
233 def test_vuln_count_confirmed(self,
227 def test_vuln_count_confirmed_2(self,
234228 vulnerability_factory,
235229 test_client,
236230 session,
247241
248242 def test_create_fails_with_valid_duration(self, session, test_client):
249243 workspace_count_previous = session.query(Workspace).count()
250 start_date = int(time.time())*1000
251 end_date = start_date+86400000
244 start_date = int(time.time()) * 1000
245 end_date = start_date + 86400000
252246 duration = {'start_date': start_date, 'end_date': end_date}
253247 raw_data = {'name': 'somethingdarkside', 'duration': duration}
254248 res = test_client.post(self.url(), data=raw_data)
294288 session,
295289 test_client):
296290 workspace_count_previous = session.query(Workspace).count()
297 start_date = int(time.time())*1000
298 duration = {'start_date': start_date, 'end_date': start_date-86400000}
291 start_date = int(time.time()) * 1000
292 duration = {'start_date': start_date, 'end_date': start_date - 86400000}
299293 raw_data = {'name': 'somethingdarkside', 'duration': duration}
294 res = test_client.post(self.url(), data=raw_data)
295 assert res.status_code == 400
296 assert workspace_count_previous == session.query(Workspace).count()
297
298 def test_create_fails_with_forward_slash(self, session, test_client):
299 workspace_count_previous = session.query(Workspace).count()
300 raw_data = {'name': 'swtr/'}
300301 res = test_client.post(self.url(), data=raw_data)
301302 assert res.status_code == 400
302303 assert workspace_count_previous == session.query(Workspace).count()
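The duration tests above work in epoch milliseconds: `int(time.time()) * 1000` is "now", and `86400000` ms is exactly one day, so `end_date = start_date - 86400000` is the deliberately invalid one-day-backwards case. A small helper makes the arithmetic explicit (a sketch; the helper and the use of `timedelta` are ours, only the payload shape mirrors the tests):

```python
import time
from datetime import timedelta

DAY_MS = int(timedelta(days=1).total_seconds() * 1000)  # 86_400_000

def workspace_duration(days=1, start_ms=None):
    """Return a {'start_date', 'end_date'} payload in epoch milliseconds."""
    start = int(time.time()) * 1000 if start_ms is None else start_ms
    return {'start_date': start, 'end_date': start + days * DAY_MS}

# days=1 gives the valid one-day workspace; days=-1 reproduces the case the API rejects with 400.
```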
363364
364365 @pytest.mark.skip # TODO fix fox sqlite
365366 def test_list_retrieves_all_items_from(self, test_client):
366 super(TestWorkspaceAPI, self).test_list_retrieves_all_items_from(test_client)
367 super().test_list_retrieves_all_items_from(test_client)
367368
368369 def test_workspace_activation(self, test_client, workspace, session):
369370 workspace.active = False
374375
375376 res = test_client.get(f'{self.url()}{workspace.name}/')
376377 active = res.json.get('active')
377 assert active == True
378 assert active
378379
379380 active_query = session.query(Workspace).filter_by(id=workspace.id).first().active
380 assert active_query == True
381 assert active_query
381382
382383 def test_workspace_deactivation(self, test_client, workspace, session):
383384 workspace.active = True
388389
389390 res = test_client.get(f'{self.url()}{workspace.name}/')
390391 active = res.json.get('active')
391 assert active == False
392 assert not active
392393
393394 active_query = session.query(Workspace).filter_by(id=workspace.id).first().active
394 assert active_query == False
395 assert not active_query
395396
396397 def test_create_fails_with_start_date_greater_than_end_date(self,
397398 session,
410411 return v2_to_v3(url)
411412
412413 def url(self, obj=None):
413 return v2_to_v3(super(TestWorkspaceAPIV3, self).url(obj))
414 return v2_to_v3(super().url(obj))
414415
415416 def test_workspace_activation(self, test_client, workspace, session):
416417 workspace.active = False
421422
422423 res = test_client.get(self.url(workspace))
423424 active = res.json.get('active')
424 assert active == True
425 assert active
425426
426427 active_query = session.query(Workspace).filter_by(id=workspace.id).first().active
427 assert active_query == True
428 assert active_query
428429
429430 def test_workspace_deactivation(self, test_client, workspace, session):
430431 workspace.active = True
435436
436437 res = test_client.get(self.url(workspace))
437438 active = res.json.get('active')
438 assert active == False
439 assert not active
439440
440441 active_query = session.query(Workspace).filter_by(id=workspace.id).first().active
441 assert active_query == False
442 assert not active_query
0 #-*- coding: utf8 -*-
0 # -*- coding: utf8 -*-
11 '''
22 Faraday Penetration Test IDE
33 Copyright (C) 2013 Infobyte LLC (http://www.infobytesec.com/)
77 from builtins import str
88 from posixpath import join as urljoin
99
10 from tests.utils.url import v2_to_v3
11
1210 """Generic tests for APIs prefixed with a workspace_name"""
1311
1412 import pytest
1513 from sqlalchemy.orm.util import was_deleted
16 from faraday.server.models import db, Workspace, Credential
14 from faraday.server.models import db
1715 from tests.test_api_pagination import PaginationTestsMixin as \
1816 OriginalPaginationTestsMixin
1917
2321
2422 @pytest.mark.usefixtures('logged_user')
2523 class GenericAPITest:
26
2724 model = None
2825 factory = None
2926 api_endpoint = None
6663 @pytest.fixture
6764 def mock_envelope_list(self, monkeypatch):
6865 assert self.view_class is not None, 'You must define view_class ' \
69 'in order to use ListTestsMixin or PaginationTestsMixin'
66 'in order to use ListTestsMixin or PaginationTestsMixin'
7067
7168 def _envelope_list(_, objects, pagination_metadata=None):
7269 return {"data": objects}
70
7371 monkeypatch.setattr(self.view_class, '_envelope_list', _envelope_list)
7472
7573 @pytest.mark.usefixtures('mock_envelope_list')
8886 session.commit()
8987 res = test_client.get(self.url())
9088 assert res.status_code == 200
89
9190
9291 class RetrieveTestsMixin:
9392
134133 db.session.commit()
135134 assert res.status_code == 403
136135 assert self.model.query.count() == count
137
138136
139137 def test_create_inactive_fails(self, test_client):
140138 self.workspace.deactivate()
272270
273271 @pytest.mark.parametrize("method", ["PUT", "PATCH"])
274272 def test_update_an_object(self, test_client, method):
275 super(PatchableTestsMixin, self).test_update_an_object(test_client, method)
273 super().test_update_an_object(test_client, method)
276274
277275 @pytest.mark.parametrize("method", ["PUT", "PATCH"])
278276 def test_update_an_object_readonly_fails(self, test_client, method):
279 super(PatchableTestsMixin, self).test_update_an_object_readonly_fails(test_client, method)
277 super().test_update_an_object_readonly_fails(test_client, method)
280278
281279 @pytest.mark.parametrize("method", ["PUT", "PATCH"])
282280 def test_update_inactive_fails(self, test_client, method):
283 super(PatchableTestsMixin, self).test_update_inactive_fails(test_client, method)
281 super().test_update_inactive_fails(test_client, method)
284282
285283 @pytest.mark.parametrize("method", ["PUT", "PATCH"])
286284 def test_update_fails_with_existing(self, test_client, session, method):
287 super(PatchableTestsMixin, self).test_update_fails_with_existing(test_client, session, method)
285 super().test_update_fails_with_existing(test_client, session, method)
288286
289287 def test_update_an_object_fails_with_empty_dict(self, test_client):
290288 """To do this the user should use a PATCH request"""
293291
294292 @pytest.mark.parametrize("method", ["PUT", "PATCH"])
295293 def test_update_cant_change_id(self, test_client, method):
296 super(PatchableTestsMixin, self).test_update_cant_change_id(test_client, method)
294 super().test_update_cant_change_id(test_client, method)
295
297296
298297 class CountTestsMixin:
299298 def test_count(self, test_client, session, user_factory):
306305 factory_kwargs[field] = value
307306
308307 session.add(self.factory.create(creator=self.first_object.creator,
309 workspace=self.first_object.workspace,
310 **factory_kwargs))
308 workspace=self.first_object.workspace,
309 **factory_kwargs))
311310
312311 session.commit()
313312
363362 assert creators == sorted(creators, reverse=True)
364363
365364
366
367365 class DeleteTestsMixin:
368366
369367 def test_delete(self, test_client):
389387 assert self.model.query.count() == OBJECT_COUNT
390388
391389 def test_delete_from_other_workspace_fails(self, test_client,
392 second_workspace):
390 second_workspace):
393391 res = test_client.delete(self.url(self.first_object,
394392 workspace=second_workspace))
395393 assert res.status_code == 404 # No content
450448 assert res.status_code == 200
451449 assert len(res.json['data']) == OBJECT_COUNT
452450
451
453452 class ReadWriteMultiWorkspacedAPITests(ReadOnlyMultiWorkspacedAPITests,
454453 ReadWriteTestsMixin):
455454 pass
1414 user = User.query.filter_by(username='test_change_pass').first()
1515
1616 assert not verify_password('old_pass', user.password)
17 assert verify_password('new_pass', user.password)
17 assert verify_password('new_pass', user.password)
77
88 from faraday.server.models import Host, Service, Vulnerability
99 import random
10
11
1012 def new_random_workspace_name():
1113 return ("aworkspace" + "".join(random.sample([chr(i) for i in range(65, 90)
12 ], 10 ))).lower()
14 ], 10))).lower()
15
1316
1417 def create_host(self, host_name="pepito", os="linux"):
1518 host = Host(host_name, os)
1619 self.model_controller.addHostSYNC(host)
1720 return host
1821
22
1923 def create_interface(self, host, iname="coqiuto", mac="00:03:00:03:04:04"):
2024 raise NotImplementedError()
2125
22 def create_service(self, host, interface, service_name = "coquito"):
26
27 def create_service(self, host, interface, service_name="coquito"):
2328 service = Service(service_name)
2429 self.model_controller.addServiceToInterfaceSYNC(host.getID(),
25 interface.getID(), service)
30 interface.getID(), service)
2631 return service
32
2733
2834 def create_host_vuln(self, host, name, desc, severity):
2935 vuln = Vulnerability(name, desc, severity)
3137
3238 return vuln
3339
40
3441 def create_int_vuln(self, host, interface, name, desc, severity):
3542 vuln = Vulnerability(name=name, description=desc, severity=severity)
3643 self.model_controller.addVulnToInterfaceSYNC(host.getID(), interface.getID(), vuln)
3744
3845 return vuln
3946
47
4048 def create_serv_vuln(self, host, service, name, desc, severity):
4149 vuln = Vulnerability(name=name, description=desc, severity=severity)
4250 self.model_controller.addVulnToServiceSYNC(host.getID(), service.getID(), vuln)
4351
4452 return vuln
45
46
47 # I'm Py3
33 See the file 'doc/LICENSE' for the license information
44
55 '''
6
7 # I'm Py3
2020
2121 assert session.query(Host).filter(
2222 Workspace.id == workspace.id
23 ).first() == None
23 ).first() is None
2424
2525
2626 def test_child_parent_verification_event_succeeds(session, workspace):
5454
5555 def test_child_parent_verification_event_changing_id_fails(session, workspace,
5656 second_workspace):
57
57
5858 session.add(workspace)
5959 session.add(second_workspace)
6060 session.commit()
6464 service = ServiceFactory.build(host=host, workspace_id=second_workspace.id)
6565
6666 session.add(service)
67
67
6868 with pytest.raises(AssertionError):
6969 session.commit()
7070
7171
72 # I'm Py3
72 # I'm Py3
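The `== None` to `is None` change above concerns a query *result*: `.first()` returns a model instance or `None`, so identity comparison is the idiomatic (and E711-clean) check. Comparing a value to `None` with `==` only stays meaningful inside SQLAlchemy filter expressions, where it is overloaded to emit `IS NULL`. A generic SQLAlchemy illustration of the two contexts (the toy `Host` model below is ours, not Faraday's):

```python
from sqlalchemy import Column, Integer, String, create_engine, select
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class Host(Base):                      # toy model for illustration only
    __tablename__ = 'host'
    id = Column(Integer, primary_key=True)
    description = Column(String, nullable=True)

engine = create_engine('sqlite://')
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add(Host(description=None))
    session.commit()
    # Filter expression: comparing a column to None renders SQL "IS NULL"
    # (spelled .is_(None) here to keep linters quiet).
    row = session.execute(select(Host).where(Host.description.is_(None))).scalar_one()
    assert row is not None
    # Query result: a plain Python object or None, so identity is the right check,
    # just like `.first() is None` in the test above.
    assert session.get(Host, 999) is None
```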
2020
2121 def test_image_is_detected_correctly():
2222
23 with open(TEST_DATA_PATH / 'faraday.png', "rb")as image_data:
23 with open(TEST_DATA_PATH / 'faraday.png', "rb") as image_data:
2424 field = FaradayUploadedFile(image_data.read())
2525 assert field['content_type'] == 'image/png'
2626 assert 'thumb_id' in field.keys()
2929
3030
3131 def test_normal_attach_is_not_detected_as_image():
32 with open(TEST_DATA_PATH / 'report_w3af.xml', "rb")as image_data:
32 with open(TEST_DATA_PATH / 'report_w3af.xml', "rb") as image_data:
3333 field = FaradayUploadedFile(image_data.read())
3434 assert field['content_type'] == 'application/octet-stream'
3535 assert len(field['files']) == 1
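The uploaded-file tests above rely on `FaradayUploadedFile` sniffing the content type from the raw bytes. For PNGs that boils down to the fixed 8-byte file signature; a tiny standalone check of that idea (illustrative only, not how `FaradayUploadedFile` is actually implemented):

```python
PNG_SIGNATURE = b"\x89PNG\r\n\x1a\n"   # fixed 8-byte header of every PNG file

def looks_like_png(data: bytes) -> bool:
    """Return True if the raw bytes start with the PNG file signature."""
    return data[:8] == PNG_SIGNATURE

# e.g. looks_like_png(open('faraday.png', 'rb').read()) -> True
```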
2727 self.use_ldaps = ldap.use_ldaps
2828 self.use_start_tls = ldap.use_start_tls
2929
30
3130 def test_storage(self):
3231 from faraday.server.config import storage
3332 self.path = storage.path
34
35
36
37 # I'm Py3
55 from faraday.searcher.api import Api
66 from faraday.searcher.searcher import Searcher
77 from faraday.searcher.sqlapi import SqlApi
8 from faraday.server.models import Service, Host, VulnerabilityWeb
8 from faraday.server.models import Service, Host, VulnerabilityWeb, Rule
99 from faraday.server.models import Vulnerability, CommandObject
1010 from faraday.server.schemas import WorkerRuleSchema
1111 from faraday.utils.smtp import MailNotification
1919 ActionFactory,
2020 RuleActionFactory,
2121 UserFactory,
22 ConditionFactory,
2223 )
2324 from tests.factories import WorkspaceFactory, VulnerabilityFactory
2425
527528 searcher.process(rules)
528529 vuln = session.query(Vulnerability).get(vuln_id)
529530 assert vuln.severity == 'informational'
530
531531
532532 @pytest.mark.parametrize("api", [
533533 lambda workspace, test_client, session: Api(workspace.name, test_client, session, username='test',
854854 assert vulns_count == 10
855855
856856 searcher = Searcher(api(workspace, test_client, session))
857 rule_disabled = RuleFactory.create(object="severity=low", disabled=True, workspace=workspace)
858 rule_enabled = RuleFactory.create(object="severity=medium", disabled=False, workspace=workspace)
857 rule_disabled: Rule = RuleFactory.create(disabled=True, workspace=workspace)
858 rule_enabled = RuleFactory.create(disabled=False, workspace=workspace)
859 rule_disabled.conditions = [ConditionFactory.create(field='severity', value="low")]
860 rule_enabled.conditions = [ConditionFactory.create(field='severity', value="medium")]
859861
860862 action = ActionFactory.create(command='DELETE')
861863 session.add(action)
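The searcher hunk above reflects a model change: a rule's match criteria no longer live in a free-form `object="severity=low"` string but in `Condition` rows attached through `rule.conditions`. A sketch of how a test might build such a rule under the new shape, reusing the factories the diff itself imports (the `field`/`value` kwargs come from the diff; the helper and everything else are assumptions):

```python
# Sketch only: factory names and kwargs mirror the diff above; the helper itself is ours.
from tests.factories import RuleFactory, ConditionFactory

def make_severity_rule(workspace, severity, disabled=False):
    """Create a searcher rule matching vulns of `severity` via a Condition row."""
    rule = RuleFactory.create(disabled=disabled, workspace=workspace)
    rule.conditions = [ConditionFactory.create(field='severity', value=severity)]
    return rule

# e.g. rule_enabled = make_severity_rule(workspace, 'medium')
#      rule_disabled = make_severity_rule(workspace, 'low', disabled=True)
```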
4242 self.assertEqual(res.status_code, 401)
4343
4444 def test_401_when_getting_an_existent_view_agent_token(self):
45 res = self.app.get('/', headers={'authorization':'agent 1234'})
45 res = self.app.get('/', headers={'authorization': 'agent 1234'})
4646 self.assertEqual(res.status_code, 401)
4747
4848 def test_401_when_getting_an_existent_view_user_token(self):
49 res = self.app.get('/', headers={'authorization':'token 1234'})
49 res = self.app.get('/', headers={'authorization': 'token 1234'})
5050 self.assertEqual(res.status_code, 401)
5151
5252 def test_401_when_posting_an_existent_view_and_not_logged(self):
53 res = self.app.post('/', data={'data':'data'})
53 res = self.app.post('/', data={'data': 'data'})
5454 self.assertEqual(res.status_code, 401)
5555
5656 def test_401_when_accessing_a_non_existent_view_and_not_logged(self):
57 res = self.app.post('/dfsdfsdd', data={'data':'data'})
57 res = self.app.post('/dfsdfsdd', data={'data': 'data'})
5858 self.assertEqual(res.status_code, 401)
5959
6060 def test_200_when_not_logged_but_endpoint_is_public(self):
9191 if __name__ == '__main__':
9292 unittest.main()
9393
94
9594 # I'm Py3
3434 assert copy_default_config_to_local() is None
3535 assert not copyfile.called
3636
37
3738 VERSION_PATTERN = r"""
3839 v?
3940 (?:
7071 re.VERBOSE | re.IGNORECASE,
7172 )
7273
74
7375 def isPEP440(arg):
7476 return not _regex.match(arg) is None
7577
78
7679 def test_exists_and_content():
7780 assert isPEP440(__version__)
78
79
80 # I'm Py3
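The version test above checks `__version__` against a hand-rolled copy of the PEP 440 regex. An equivalent check can delegate to the `packaging` library, which implements the same specification; a sketch assuming `packaging` is acceptable as a test dependency (the hand-written regex avoids exactly that requirement):

```python
from packaging.version import Version, InvalidVersion

def is_pep440(arg: str) -> bool:
    """True if `arg` parses as a PEP 440 version string (e.g. '3.15.0', '3.15.0rc1')."""
    try:
        Version(arg)
    except InvalidVersion:
        return False
    return True

assert is_pep440('3.15.0') and not is_pep440('not-a-version')
```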
33
44 from faraday.server.utils.filters import FilterSchema
55 from faraday.server.utils.filters import FlaskRestlessSchema
6 from faraday.server.models import VulnerabilityWeb
76
87
98 class TestFilters:
1918
2019 def test_restless_using_order_by(self):
2120 test_filter = {
22 "order_by":[
23 {"field":"host__vulnerability_critical_generic_count"},
24 {"field":"host__vulnerability_high_generic_count"},
25 {"field":"host__vulnerability_medium_generic_count"},
21 "order_by": [
22 {"field": "host__vulnerability_critical_generic_count"},
23 {"field": "host__vulnerability_high_generic_count"},
24 {"field": "host__vulnerability_medium_generic_count"},
2625 ],
2726 "filters": [{
2827 "or": [
3534 res = FlaskRestlessSchema().load(test_filter)
3635 assert res == test_filter
3736
38
3937 def test_FlaskRestlessSchema_(self):
4038 test_filter = [{"name": "severity", "op": "eq", "val": "low"}]
4139 res = FlaskRestlessSchema().load(test_filter)
4442 def test_simple_and_operator(self):
4543 test_filter = {"filters": [
4644 {'and': [
47 {"name": "severity", "op": "eq", "val": "low"},
48 {"name": "severity", "op": "eq", "val": "medium"}
49 ]
45 {"name": "severity", "op": "eq", "val": "low"},
46 {"name": "severity", "op": "eq", "val": "medium"}
47 ]
5048 }
5149
5250 ]}
181179 else:
182180 assert and_op == {"name": "severity", "op": "eq", "val": "high"}
183181
184
185182 def test_case_1(self):
186183 filter_schema = FilterSchema()
187184 filters = {'filters': [{"name": "confirmed", "op": "==", "val": "true"}]}
196193
197194 def test_case_3(self):
198195 filters = {'filters': [
196 {"and": [
199197 {"and": [
200 {"and": [
201 {"name": "severity", "op": "eq", "val": "critical"},
202 {"name": "confirmed", "op": "==", "val": "true"}
203 ]},
204 {"name": "host__os", "op": "has", "val": "Linux"}
205 ]}
206 ]}
198 {"name": "severity", "op": "eq", "val": "critical"},
199 {"name": "confirmed", "op": "==", "val": "true"}
200 ]},
201 {"name": "host__os", "op": "has", "val": "Linux"}
202 ]}
203 ]}
207204 res = FilterSchema().load(filters)
208205 assert res == filters
209206
210207 def test_test_case_recursive(self):
211208 filters = {"filters":
212 [{"or":[
213 {"name":"severity","op":"eq","val":"medium"},
214 {"or":[
215 {"name":"severity","op":"eq","val":"high"},
216 {"and":[
217 {"and":[
218 {"name":"severity","op":"eq","val":"critical"},
219 {"name":"confirmed","op":"==","val":"true"}
220 ]},
221 {"name":"host__os","op":"has","val":"Linux"}
209 [{"or": [
210 {"name": "severity", "op": "eq", "val": "medium"},
211 {"or": [
212 {"name": "severity", "op": "eq", "val": "high"},
213 {"and": [
214 {"and": [
215 {"name": "severity", "op": "eq", "val": "critical"},
216 {"name": "confirmed", "op": "==", "val": "true"}
217 ]},
218 {"name": "host__os", "op": "has", "val": "Linux"}
222219 ]}
223220 ]}
224221 ]}
225 ]}
222 ]}
226223 res = FilterSchema().load(filters)
227224 assert res == filters
228225
229226 def test_case_recursive_2(self):
230227 filters = {'filters': [
231 {"and": [
232 {"and": [
233 {"name": "severity", "op": "eq", "val": "critical"},
234 {"name": "confirmed", "op": "==", "val": "true"}
235 ]},
236 {"name": "host__os", "op": "has", "val": "Linux"}
237 ]}
238 ]}
228 {"and": [
229 {"and": [
230 {"name": "severity", "op": "eq", "val": "critical"},
231 {"name": "confirmed", "op": "==", "val": "true"}
232 ]},
233 {"name": "host__os", "op": "has", "val": "Linux"}
234 ]}
235 ]}
239236
240237 res = FilterSchema().load(filters)
241238 assert res == filters
5252 map(lambda column: column.strip("'')").strip('-1').strip('-1));').strip(), statements_clean)
5353 )
5454 )
55 statements_clean.remove('source_code_id') # we don't support source_code yet
55 statements_clean.remove('source_code_id') # we don't support source_code yet
5656 unique_constraints = get_unique_fields(session, Vulnerability())
5757 for unique_constraint in unique_constraints:
5858 assert len(statements_clean) == len(unique_constraint)
6969 for unique_constraint in unique_constraints:
7070 assert unique_constraint == expected_unique_fields
7171
72
7372 # I'm Py3