The primary use case for the IriusRisk Analytics module is to expose information from the IriusRisk platform to users through dashboards. In its raw form, a dashboard allows users to present data from a query (dataset) via a widget (visualization). By default this visualization is a data grid (a report) that includes the fields defined within the query. The data can then be formatted into a variety of charts, such as pie, column, donut, bar, area, or line.
In many cases organizations wish to aggregate the threat modelling data into centrally managed BI tools. In order to do so, there are two alternatives:
- IriusRisk Platform API - https://app.swaggerhub.com/apis/continuumsecurity/IriusRisk - information is extracted and presented as JSON (if you want to pretty-print it, look at https://stedolan.github.io/jq/ )
- IriusRisk Analytics Module API - each instance of the analytics engine has built-in documentation on how to use the analytics module; reference https://<your analytics instance>.iriusrisk.com/analytics/docs
Information that is not accessible via the IriusRisk Platform API but is stored within the IriusRisk Platform database, such as the Audit Event information, can only be accessed via the IriusRisk Analytics Module API.
The IriusRisk Analytics Module API provides a convenient mechanism for extracting and formatting (JSON/CSV) data that customers can consume in 3rd-party solutions.
Identifying your Data Set - Use a Query
Analytics Queries
Your data set is defined by a query. This query can be generated via the Analytics UI (no SQL required) or by writing SQL directly. There is also a series of pre-defined queries, supplied within the project, which describe the default dashboards.
New queries can be generated by selecting NEW QUERY (you can edit an existing query if it is close to your requirements).
For the Datasource, in simple use cases use the IriusRisk datasource (defaultIriusRisk).
You can then start creating a query across the database tables within the Analytics Module. By default you create the query using the visual editor, which provides a drag-and-drop facility for adding tables to the query.
Start Creating a Query
Alternatively, you can switch focus to the Editor, and create your queries directly using SQL. You can even take the example queries below and paste the SQL directly into the editor.
Don't forget to give your query a name in the settings and to set your data strategy: either run once or configure it as a direct query. You can then SAVE&RUN, define your visualization, and/or run the query using the API (see below).
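As a minimal sketch of something you could paste into the SQL editor (it simply lists projects from one of the tables used in the examples below; adjust it to your own needs):
select p.name, p.type
from project p
where p.type = 'STANDARD'
order by p.name
limit 100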
Example Queries
NOTE: you can copy the SQL shown in the examples below to define your own queries.
NOTE: by default the output is limited to 10,000 rows. This limit can be removed as necessary, but doing so may affect the performance of the service.
Audit Log
Visual Editor:
table = audit_event
SQL Editor:
PostgreSQL Query:
with user_bu as (select u.username as username, array_remove(array_agg(bu.name), NULL) as bus from users u
left join user_group ug on ug.user_id = u.id
left join ugroup bu on ug.ugroup_id = bu.id
group by u.username)
select ae.event_type, ae.timestamp, ae.project_ref,
ae.username, ae.details, user_bu.bus
from "audit_event" ae
join user_bu on user_bu.username = ae.username
order by "timestamp" desc;
Cloud9QL Transformations:
Select event_type as Event type,
DATE_FORMAT(timestamp, 'dd/MM/yyyy HH:mm:ss') as Event Date,
project_ref as Affected Project,
username as User,
bus as User Business Units,
details as Details
Deleted Projects
Visual Editor:
table = audit_event
filter = event_type Equals PROJECT_DELETED
SQL Editor:
PostgreSQL Query:
select *
from "audit_event"
where "event_type" = 'PROJECT_DELETED'
limit 10000
Cloud9QL Transformations:
Not Applicable
All answers to the Component Questionnaire (per project):
Visual Editor:
N/A
SQL Editor:
PostgreSQL Query:
WITH question as (select * from (
select question[1] as question, data[1] as answer_ref, data[2] as answer_text from rule r
join risk_action_rule ar on r.id = ar.rule_id
join risk_action a on a.id = ar.risk_action_id
join condition_rule cr on r.id = cr.rule_id
join condition c on c.id = cr.condition_id,
string_to_array(ar.value,'_::_') as data,
string_to_array(cr.value,'_::_') as question
where c.ref = 'CONDITION_COMPONENT_QUESTION_GROUP_EXISTS'
and a.ref = 'INSERT_COMPONENT_QUESTION') as relationship
join (
select distinct data[1] as ref, data[3] as text from risk_action_rule rar
join risk_action ra on rar.risk_action_id = ra.id, string_to_array(rar.value,'_::_') as data
where ra.ref = 'INSERT_COMPONENT_QUESTION_GROUP') as question on relationship.question = question.ref),
bu_nested AS ( select p.name as project, array_remove(array_agg(bu.name), NULL) as bus from project p
left join projectugroup pu on pu.project_id = p.id
left join ugroup bu on pu.ugroup_id = bu.id
where p.type = 'STANDARD'
and not exists (select 1 from project_version where project_id = p.id)
group by p.name)
select q.answer_text as answer_text, q.text as question_text, bu.bus as business_unit , c.name as component, p.name as project
from question as q
join component_wizard_question as cwq on q.answer_ref = cwq.ref
join component as c on cwq.component_id = c.id
join project as p on c.project_id = p.id
join bu_nested bu on bu.project = p.name
Cloud9QL Transformations:
Select project as 'Project',
business_unit as 'Project BUs',
component as 'Component',
question_text as 'Question',
answer_text as 'Answer'
Progress on projects:
Progress on projects (Diagram created, Last Diagram Modified Date, Security Review performed, Last Security Review Date, Remediation progress (number of threats mitigated / total number of threats), Last Remediation Date, Completed Projects, Date of completion).
- Diagram Created: Base64 image
- Last Diagram Modified Date: last model synchronization
- Security Review performed: all projects that are in a completed state should show Security Review performed
- Last Security Performed Date: date when the project moved from the Security Review state to the completed state
- Remediation progress (number of threats mitigated / total number of threats)
- Last Remediation Date
- Completed Projects
- Date of completion
Visual Editor:
N/A
SQL Editor:
PostgreSQL Query:
WITH last_diagram_change as (
select p.id as project_id, ar.content, max(timestamp) as last_diagram_modified_date
from artifact as ar
inner join project as p on p.id = ar.project_id
where ar.show=true
and p.type = 'STANDARD'
group by ar.content, p.id
),
last_state_change as (select project_id, project_name, state, ws_name, max(date) as date_of_completion from (
select pa.id as project_id, p.name as project_name, pa.workflow_state_id as state,
ws.name as ws_name,
rev.rev_timestamp as date,
LAG (pa.workflow_state_id, 1)
OVER (
PARTITION BY pa.id
ORDER BY rev.rev_timestamp
) previous_state
from project_audit pa
join revinfo as rev on pa.rev_id = rev.rev
join workflow_state ws on ws.id = pa.workflow_state_id
join project p on p.id = pa.id and p.workflow_state_id = pa.workflow_state_id
where p.type = 'STANDARD'
) as historic
where state is distinct from previous_state
group by project_id,project_name, state, previous_state, ws_name),
total_threats_project as (
select
count(t.id) as total_threats,
p.id as project
from threat t
join component u on t.component_id = u.id
join project p on u.project_id = p.id
where not exists (select 1 from project_version where project_id = p.id)
and p.type = 'STANDARD'
group by project
),
total_threats_mitigated_project as (
select
count(t.id) as total_threats_mitigated,
max(tce.timestamp) as last_remediation_date,
p.id as project
from threat t
join component u on t.component_id = u.id
join project p on u.project_id = p.id
join threat_change_event tce on t.id = tce.threat_id and tce.state = 'MITIGATE' and tce.project_id = p.id
where not exists (select 1 from project_version where project_id = p.id)
and p.type = 'STANDARD'
and t.state = 'MITIGATE'
group by project
)
select
lsc.project_name as project_name,
ws_name as workflow_state,
to_timestamp(date_of_completion / 1000) as date_of_completion,
last_diagram_modified_date as last_diagram_modified_date,
content as content,
total_threats,
total_threats_mitigated,
last_remediation_date
from last_state_change lsc
left join total_threats_project ttp on ttp.project = lsc.project_id
left join total_threats_mitigated_project ttmp on ttmp.project = ttp.project
left join last_diagram_change ldc on ldc.project_id = lsc.project_id
Cloud9QL Transformations:
select
content as Diagram Created,
last_diagram_modified_date as Last Diagram Modified Date,
project_name as Project Name,
workflow_state as Workflow state,
date_of_completion as Last Security Performed Date,
CASE
WHEN total_threats > 0
THEN total_threats
ELSE 0
END AS 'Total threats',
CASE
WHEN total_threats_mitigated > 0
THEN total_threats_mitigated
ELSE 0
END AS 'Total threats mitigated',
last_remediation_date as Last Remediation Date
;
select *
Extracting your data
Under normal circumstances you would access the information generated by your query via the Analytics Dashboard, in which case you would create a widget/visualization and include it within your dashboard(s). But if you want to collect the data and re-use it within another tool, you need to use the Analytics Engine API.
https://<your.analytics.instance>.iriusrisk.com/analytics/docs/managementAPI.html
Enable API access
API access is enabled from your Account Settings; if you do not have access to this option, you will need to speak to your administrator.
Both the Client ID and Client Secret are provisioned as UUIDs (obfuscated here); you will need both in order to create a session and access the Analytics Module data.
Login
curl -i -d "client_id=<your client id>&client_secret=<your client secret>" -X POST <https://<your analytics instance>.iriusrisk.com/analytics/api/1.0/login
This will return your bearer token, which you can use within the session to access the analytics data; you do not need to re-use the client ID or secret. The access token changes every time you log in to the service, and by default the session lasts 3600 seconds (1 hour). You should use the logout API call to terminate the session when you have completed your activities; this is good security practice.
{"access_token":"<your session access token>","token_type":"Bearer","expires_in":3600}
Identify your Query
Each query has unique attributes with which to identify it: entityName and identifier. This information can be retrieved from the UI or via the API, using the bearer token (the session access token from the login API call above).
curl -s -H "Authorization: Bearer <your session access token>" "https://<your analytics instance>.iriusrisk.com/analytics/api/1.0/queries"
This returns raw JSON: a list of the data sets (queries) that you, as a user, have access to. If you pretty-print the output, each query is represented by the following content:
Json Snippet of all Queries
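As an illustrative sketch only (the real response contains additional metadata fields beyond the two shown, and the exact structure may vary), the relevant part of each entry looks similar to:
{
  "entityName": "dD - Deleted Projects",
  "identifier": "BNmR4z",
  ...
}
Assuming the response is a plain JSON array of such entries, you could, for example, list just these two attributes for every query by piping the output through jq -r '.[] | "\(.identifier)  \(.entityName)"'.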
In this example, the entityName representing the query is dD - Deleted Projects and the identifier is BNmR4z.
Create our Data Set
In order to create the required data set, you need to identify the query that you need to run (from the above output), either the entityName:
curl -i -H "Authorization: Bearer <your session access token>" -G "https://<your analytics instance>.iriusrisk.com/analytics/api/1.0/datasets" --data-urlencode "entityName=dD - Deleted Projects"
or the identifier:
curl -s -H "Authorization: Bearer <your session access token>" "https://<your analytics instance>.iriusrisk.com/analytics/api/1.0/datasets?identifier=BNmR4z"
By default the output is JSON; you can force the format using either:
curl -i -H "Authorization: Bearer <your session access token>" -G "https://<your analytics instance>.iriusrisk.com/analytics/api/1.0/datasets?exportFormat=csv" --data-urlencode "entityName=dD - Deleted Projects"
or:
curl -s -H "Authorization: Bearer <your session access token>" "https://<your analytics instance>.iriusrisk.com/analytics/api/1.0/datasets?identifier=BNmR4z&exportFormat=csv"
where exportFormat can be one of csv or json, returning the results of the query (in this case, identifying the deleted projects from the audit log).
Running the Deleted Projects query (based on the audit_event table) from earlier results in the following output (JSON):
[
{"id":204902,"details":"rgarciap","event_type":"PROJECT_DELETED","project_ref":null,"username":"analyticsdemo","version_id":null,"timestamp":1661435499230},
{"id":204920,"details":"David Doughty","event_type":"PROJECT_DELETED","project_ref":null,"username":"ddoughty","version_id":null,"timestamp":1661503971332},
{"id":205129,"details":"rg - to delete","event_type":"PROJECT_DELETED","project_ref":null,"username":"analyticsdemo","version_id":null,"timestamp":1661838985511},
{"id":220329,"details":"CloudApp","event_type":"PROJECT_DELETED","project_ref":null,"username":"kchau","version_id":null,"timestamp":1662129175733}
]
and the following output (CSV):
id,details,event_type,project_ref,username,version_id,timestamp
204902,rgarciap,PROJECT_DELETED,,analyticsdemo,,"08/25/2022 13:51:39"
204920,David Doughty,PROJECT_DELETED,,ddoughty,,"08/26/2022 08:52:51"
205129,rg - to delete,PROJECT_DELETED,,analyticsdemo,,"08/30/2022 05:56:25"
220329,CloudApp,PROJECT_DELETED,,kchau,,"09/02/2022 14:32:55"
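Putting the steps together, the following is a minimal sketch of a scripted extraction (the endpoints, placeholders, and the example identifier BNmR4z are taken from the commands above; adjust them for your own instance and query):
# 1. Log in and capture a session access token
TOKEN=$(curl -s -d "client_id=<your client id>&client_secret=<your client secret>" \
  -X POST "https://<your analytics instance>.iriusrisk.com/analytics/api/1.0/login" | jq -r '.access_token')
# 2. Pull the data set as CSV and save it for consumption by a 3rd-party tool
curl -s -H "Authorization: Bearer $TOKEN" \
  "https://<your analytics instance>.iriusrisk.com/analytics/api/1.0/datasets?identifier=BNmR4z&exportFormat=csv" \
  -o deleted_projects.csv
# 3. Finally, terminate the session via the logout call described in the management API documentation linked above (path not shown here)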