This is documentation for integrating and sending data from third-party services to Shuffle. Not to be confused with apps and workflows.
From the start, Shuffle has been a platform about integrations. We've focused on making them as open and usable as possible, but one part was missing: inbound data. Shuffle generally handles this through third-party APIs, polling for data on a schedule. There are, however, cases where polling doesn't do the trick. That's what extensions are for.
These integrations will typically entail third party services connecting to Shuffle with inbound Webhooks as triggers in a workflow.
Shuffle added Single Sign-on (SAML) in version 0.9.16 and OpenID Connect in 1.0.0. This allows you to log into Shuffle from other sources, entirely controlled by your external environment. SSO is available onprem, even without the Enterprise version of Shuffle cloud. It works by setting an Entrypoint (IdP) and an X509 certificate, both used to validate the requests. These can be added under /admin, and only work for your PRIMARY organization.
ONPREM ONLY: You will have to change the SSO_REDIRECT_URL variable in the .env file to match your frontend server link, i.e. SSO_REDIRECT_URL=http://<URL>:<PORT>
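For example, a minimal .env sketch (the hostname and port below are hypothetical; use your own frontend address):

```
# .env (onprem) - example values only, replace with your own frontend URL and port
SSO_REDIRECT_URL=http://shuffle.example.com:3001
```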
How it works:
PS: In some cases, the person's username may appear as an ID in Shuffle. If so, an admin should log into Shuffle and change their username. SSO should still work.
PPS: The callback URL/redirect URL/recipient URL is https://<URL>:<PORT>/api/v1/login_sso
To use Okta SSO with Shuffle, first make an app in Okta. Their guide for making an app can be found here.
Once an application is made, it's time to find the required information. Go to the app > Sign On > Click "View Setup Instructions" under SAML 2.0. This will open a new page with the credentials.
Move these fields over to Shuffle:
After adding them, click "Save" to store the configuration. Then log out of your user to verify the SSO configuration. If you don't see a "Use SSO" button, you most likely configured the wrong organization.
To use Auth0 SSO with Shuffle, first make an app on https://manage.auth0.com. Documentation can be found here.
After the app is made, click "Addons" > "SAML2 Web App".
In the popup, move the data of these fields to Shuffle:
Open the Certificate file in a text editor, and copy its contents.
After adding them, click "Save" to store the configuration. Then log out of your user to verify the SSO configuration. If you don't see a "Use SSO" button, you most likely configured the wrong organization.
To use PingID SSO with Shuffle, first make an app on https://console.pingone.eu/. Documentation can be found here.
After the app is made, click the dropdown for it on the right side > Configuration > find these fields.
In the view above, move the data of these fields to Shuffle:
Open the Certificate file in a text editor, and copy its contents into the field.
After adding them, click "Save" to store the configuration. Then log out of your user to verify the SSO configuration. If you don't see a "Use SSO" button, you most likely configured the wrong organization.
Open ID SSO setup with Keycloak:
You now have a client. Click on Settings and configure it as follows:
- Your Client ID
- Set up a name
- Client Protocol set to openid-connect
- Access Type set to public
- Standard Flow Enabled toggled to ON
- Direct Access Grants toggled to ON
For the Valid Redirect URI http://
Under the Fine Grain OpenID Connect Configuration
Valid Request URIs http://
Once this is done, head over to your Shuffle instance.
1. Click on Admin button
2. Scroll down and click on the downward facing arrow beneath Organization overview
3. Scroll down again to OpenID connect
4. Fill in the client ID (It should be the same as what you entered in Keycloak)
5. Authorization URL http://<Your_Keycloak_URL>:
If you keep getting redirected to your backend URL, head to your Shuffle folder on the server and edit the .env file (e.g. vim .env).
Finally, go back to Shuffle and use the SSO button to log in.
To use OpenID with Azure AD, Shuffle supports OpenID connect with the use of Client IDs and Client secrets. To set up OpenID Connect with Azure, we use "ID_token" authentication. This entails a few normal steps regarding app creation in Azure App Registration.
Set up an app in Azure AD with ID tokens enabled
Go to App registrations and create a new app. Use "Web" for the redirect URI and point it to your Shuffle instance at /api/v1/openid_connect. From here, make sure to go to "Authentication" and enable "ID Tokens".
Get the Client ID, Client Secret and your Tenant
Start by generating a client secret. Keep it safe, as we'll use it later.
The Client ID and Tenant ID can be found in the "Overview" tab:
Go to the admin panel in Shuffle and put in the Client ID and Secret
Put in your Tenant ID in the authorization URL
The URL is as such: https://login.microsoftonline.com/TENANT_ID/oauth2/v2.0/authorize. The Token URL is not strictly required for ID Token auth.
Done! Click save and log out. Try your new login based on your Azure AD configuration.
PS: When the user is signed in, they have the access rights of a "user" in the designated organization, and will have a username according to the ID decided in O365. This can be changed by admins.
As long as you can create an identity and acquire an Entrypoint (IdP) and X509, paste them into the Shuffle fields, and it should work with any SAML/SSO provider.
Wazuh is a SIEM platform for security operations. We've used its API in multiple ways, but were missing an important component: alerting. That's why we've developed a simple alert forwarder from Wazuh to Shuffle.
PS: If you expect more than 10 alerts per second, you should add multiple workflows AND webhook forwarders to Wazuh
These are the steps to set it up:
1. Create a Workflow which will receive alerts
2. Add a Webhook to the Workflow
3. Configure Wazuh with the Webhook URL
4. Test the integration
1. Create a Workflow which will receive alerts This one is pretty easily explained. Go to Shuffle and make a new Workflow.
2. Add a Webhook to the workflow Add a webhook and find the Webhook URL. Remember to start the Webhook!
Copy the URL and keep it for the next steps
3. Configure Wazuh with the Webhook URL Start by logging into your Wazuh management console with access to edit the ossec.conf file. We'll first start by adding the Shuffle webhook forwarder.
[root@wazuh]# pwd
/var/ossec/integrations
[root@wazuh]# ls -alh
total 76K
drwxr-x---. 2 root ossec 187 Apr 25 04:34 .
drwxr-x---. 19 root ossec 242 Dec 14 09:40 ..
-rwxr-x--- 1 ossec ossec 1.1K Dec 20 04:50 custom-shuffle
-rwxr-x--- 1 ossec ossec 4.5K Jan 19 09:32 custom-shuffle.py
-rwxr-x---. 1 root ossec 4.3K Nov 30 08:41 pagerduty
-rwxr-x---. 1 root ossec 1.1K Nov 30 08:42 slack
-rwxr-x---. 1 root ossec 3.8K Nov 30 08:41 slack.py
-rwxr-x---. 1 root ossec 1.1K Nov 30 08:42 virustotal
-rwxr-x---. 1 root ossec 6.3K Nov 30 08:41 virustotal.py
$ chown root:ossec custom-shuffle
$ chown root:ossec custom-shuffle.py
$ chmod 750 custom-shuffle
$ chmod 750 custom-shuffle.py
Go to the location /var/ossec/etc/ossec.conf (or wherever the ossec.conf file is). Take the information below (including the `<integration>` tags) and add it to the configuration.
Find more fields like levels, groups and rule_id here.
<integration>
<name>custom-shuffle</name>
<level>5</level>
<hook_url>http://IP:PORT/api/v1/hooks/webhook_hookid</hook_url>
<alert_format>json</alert_format>
</integration>
systemctl restart wazuh-manager.service
You should now start seeing data sent from Wazuh into Shuffle which can be used. If data is NOT sent, make sure of a few things:
4. Test the integration There are many ways to test the integration, but you can simplify it by setting the "level" part of the configuration to a lower number (3~), as that would trigger it in a lot of cases, including when you SSH into the Wazuh manager. After an alert is supposed to have triggered, go to Shuffle, and you'll see something like the image below.
Active response can be used with the command "Run Command" in the Wazuh app in Shuffle. This requires an Agent (Agent list) and Command (Active response command). Below, we will show how to set up the custom command "reboot".
Testing custom active response from Shuffle: The goal with this section is to set up a bash script that can run custom commands from within Shuffle.
#!/bin/bash
# Wazuh active response script: reads the alert JSON on stdin,
# runs the requested command and POSTs its output to the callback URL.

# Install jq if it's missing
if ! command -v jq &> /dev/null
then
    echo "jq could not be found - installing"
    sudo apt install jq -y
    sudo yum install jq -y
fi

read -r INPUT_JSON
CMD=$(echo "$INPUT_JSON" | jq -r .parameters.alert.cmd)
CALLBACK=$(echo "$INPUT_JSON" | jq -r .parameters.alert.callback)

# CMD is intentionally unquoted so the command and its arguments are split
OUTPUT=$($CMD)
curl -XPOST "$CALLBACK" -d "$OUTPUT" -k
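The script above reads one JSON document from stdin and extracts two fields with jq. For reference, here is the same parsing step in Python (field names taken from the script; the sample callback URL is hypothetical):

```python
import json

def parse_active_response(line: str):
    """Extract the command and callback URL from the same JSON
    fields the bash script above reads with jq."""
    alert = json.loads(line)["parameters"]["alert"]
    return alert["cmd"], alert["callback"]

# Hypothetical stdin payload; the webhook URL is made up
sample = '{"parameters": {"alert": {"cmd": "uptime", "callback": "http://shuffle:3001/api/v1/hooks/webhook_abc"}}}'
cmd, callback = parse_active_response(sample)
print(cmd)  # uptime
```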
Command name: `shuffle` - Script: `shuffle.sh` - Timeout: `0`
/var/ossec/bin/wazuh-control restart
The default from this workflow is that it will reboot the server.
TheHive is a case management platform for and by security professionals. One of their key capabilities is webhooks, which can send realtime updates to a third party system whenever ANYTHING is changed within TheHive (e.g. a new alert or a case task is written). Shuffle has an ideal way of handling this, outlined in this blogpost (TheHive4).
PS: There is a difference between TheHive3 and TheHive4 in how to set this up. We are referring to TheHive4 in this section.
1. Create a Workflow in Shuffle This one is pretty easily explained. Go to Shuffle and make a new Workflow.
2. Add a Webhook to the workflow Add a webhook and find the Webhook URL. Remember to start the Webhook!
Copy the URL and keep it for the next steps
3. Configure TheHive with the Webhook URL Configuring TheHive is the only unique step here, and is different between versions. Full documentation about TheHive4 webhooks can be found here.
As can be seen in the documentation, there are three steps to setting up TheHive webhooks: 1. Define the webhook forwarder:
Find the application.conf file and scroll to the webhook section. If it doesn't exist, add the following:
notification.webhook.endpoints = [
{
name: Shuffle
url: "http://IP:PORT/api/v1/hooks/webhook_hookid"
version: 0
wsConfig: {}
includedTheHiveOrganisations: ["*"]
excludedTheHiveOrganisations: []
}
]
Modify it to have the URL you found when making the webhook in Shuffle. You can also use HTTPS by adding certificate references to wsConfig {}.
systemctl restart thehive
curl -XPUT -u$thehive_user:$thehive_password -H 'Content-type: application/json' $thehive_url/api/config/organisation/notification -d '
{
"value": [
{
"delegate": false,
"trigger": { "name": "AnyEvent"},
"notifier": { "name": "webhook", "endpoint": "Shuffle" }
}
]
}'
4. Test the integration In TheHive UI (NOT CLI), create a new case, or add a comment to an existing case. This will then be seen within Shuffle. After the webhook is supposed to have triggered, go to Shuffle, and you'll see something like the image below.
1. Create a Workflow which will receive alerts This one is pretty easily explained. Go to Shuffle and make a new Workflow.
2. Add a Webhook to the workflow Add a webhook and find the Webhook URL. Remember to start the Webhook!
Copy the URL and keep it for the next steps
3. Configure Logz.io forwarding After logging into app.logz.io, hover Settings Icon > Settings > click Notifications Endpoint. This will make all rules send alerts to the assigned webhook target.
Once in the notification endpoint view, click "Add Endpoint". Configure the following elements:
{
"alert_title": "{{alert_title}}",
"alert_description": "{{alert_description}}",
"alert_severity": "{{alert_severity}}",
"alert_event_samples": "{{alert_samples}}"
}
Custom data will be parsed into the field "alert_event_samples"
4. Test the integration Click "Run the test" at the bottom before saving. This sends a sample payload to Shuffle.
If the data payload is configured as in step 3, the custom data will be available as such:
$exec.alert_event_samples
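Since the payload template above wraps `{{alert_samples}}` in quotes, the samples may arrive as a JSON-encoded string rather than a parsed object. A small sketch of handling that downstream (values are hypothetical):

```python
import json

# Hypothetical webhook body built from the Logz.io payload template above
payload = {
    "alert_title": "Suspicious login",
    "alert_description": "Multiple failed logins",
    "alert_severity": "HIGH",
    "alert_event_samples": '[{"user": "alice", "failures": 7}]',
}

# The samples field is a string; decode it before using individual events
samples = json.loads(payload["alert_event_samples"])
print(samples[0]["user"])  # alice
```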
MISP, short for Malware Information Sharing Platform, is one of the best Open Source alternatives for Threat Intelligence. For that reason, a lot of our users have wanted a way to handle data in realtime from MISP. What kind of data? Event updates, indicator updates, IDS flag edits, Organization edit etc.
That's why we released an extension for Shuffle which can read ZMQ messages from MISP in realtime and send them to a webhook.
Steps to set it up:
1. Enable ZMQ in MISP by going to Server Settings -> Plugins in MISP. Make sure to enable the options for forwarding Events and Attributes.
2. Install pyzmq and redis on the MISP server:
pip3 install pyzmq redis
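Under the hood, MISP's ZMQ plugin publishes messages as a topic followed by a JSON payload (e.g. `misp_json {...}`), which the forwarder splits and sends on to the webhook. A minimal sketch of the splitting step (message framing assumed; the sample event is hypothetical):

```python
import json

def split_zmq_message(raw: str):
    """Split a MISP ZMQ message into its topic and decoded JSON payload."""
    topic, _, payload = raw.partition(" ")
    return topic, json.loads(payload)

sample = 'misp_json {"Event": {"id": "42", "info": "demo event"}}'
topic, event = split_zmq_message(sample)
print(topic)  # misp_json
```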
version: '3'
services:
zmq:
image: ghcr.io/frikky/shuffle-zmq:latest
container_name: shuffle-zmq
hostname: shuffle-zmq
environment:
- ZMQ_HOSTNAME=10.1.2.3
- ZMQ_PORT=9000
- ZMQ_FORWARD_URL=<WEBHOOK_URL>
restart: unless-stopped
docker-compose up -d
Ever wanted to run an action as soon as something is uploaded to your S3 Bucket? Maybe you just want to build an S3 honeypot? If so, this part is for you. Herein we describe how you can get information about files being uploaded or changed from an S3 bucket into Shuffle.
Basic overview: User -> Upload to S3 -> S3 triggers Lambda function -> Lambda function triggers Webhook -> File is downloaded from within Shuffle
Steps to set it up:
1. Create a Workflow with Webhook inside Shuffle
2. Create a Lambda function that triggers on S3 changes
3. Configure the workflow to download the file
4. Optional: Scan file with Yara
1. Create a Workflow which will receive alerts This one is pretty easily explained. Go to Shuffle and make a new Workflow. Add a webhook and find the Webhook URL. Remember to start the Webhook!
Copy the URL and keep it for the next steps
2. Create the lambda function
After logging into your AWS, go to Lambda functions and click "Create function" in the top right corner.
Use "Author from scratch", and type in a name like "Shuffle-forwarding" and make sure to choose Runtime as Python 3.8. Click "Create function" in the bottom right corner.
Click "Add trigger" in the window, left of the function. In the next menu find "S3", before choosing the bucket you want and the "Event type". Click "Add trigger"
PS: The bucket and cloud function have to be in the same location
Under Configuration > Environment variables, click "Edit". Add variable with key "SHUFFLE_WEBHOOK", and the value from step 1. Click "Save".
Time to add some code. Go to the "code" tab and paste in the code below. Click "deploy". This should now forward the request to Shuffle.
import json
import urllib.parse
import os

# Note: 'requests' is not bundled with the Lambda Python runtime;
# package it with the function or add it as a layer.
import requests

def lambda_handler(event, context):
    # Forward the first S3 record from the event to the Shuffle webhook
    #bucket = event['Records'][0]['s3']['bucket']['name']
    webhook = os.environ.get("SHUFFLE_WEBHOOK")
    if not webhook:
        return "No webhook environment defined: SHUFFLE_WEBHOOK"

    ret = requests.post(webhook, json=event["Records"][0])
    if ret.status_code != 200:
        return "Bad status code for webhook: %d" % ret.status_code

    print("Status code: %d\nData: %s" % (ret.status_code, ret.text))
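For local testing of the handler, the event's Records entry follows AWS's S3 event notification shape, roughly like this (abbreviated; the bucket and key names are made up):

```json
{
  "Records": [
    {
      "eventName": "ObjectCreated:Put",
      "s3": {
        "bucket": { "name": "my-honeypot-bucket" },
        "object": { "key": "uploads/sample.txt", "size": 1024 }
      }
    }
  ]
}
```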
3. Configure and test the workflow
To test the previous step, we have to upload a file to the chosen S3 bucket. If everything is connected correctly, this will trigger an event to come into Shuffle. If it doesn't, have a look at the Lambda function's logs, or contact us.
Now all we need to do is actually download the file. This requires access rights to the bucket itself from the API in use. The app we use for this is the "AWS S3" app, with the action "Download file from bucket". See the next image for how to configure this action after authenticating it. (We may add file uploads to webhooks in the future).
And that's it! All file updates should now come into Shuffle, including a way to download the file.
4. Extra: Scan the file with Yara
In a lot of cases you'll want to analyze the files somehow. But how? Shuffle has a built-in way to run Yara on files and get results based on built-in rules (or your own). Most of the time you'll also want to respond to that result, so we've created a way to delete the file if it matches too many Yara rules.
To do this, add in the "Yara" node with the action "Analyze file", and put "$get_s3_file.file_id" in the "File Id" field. This should match the file downloaded from the bucket.
After this, add another S3 node, and choose the action "Delete file from bucket" with these parameters:
Last but not least, we need a way to see IF the file should be deleted. This can be done by clicking the branch between Yara and the S3 delete action > "New Condition" > with the following data:
{{ $run_yara.matches | size }} > 3
Done! Whenever a file is downloaded, it will be analyzed by Yara and removed if it matches more than 3 rules.
QRadar can send an offense webhook to Shuffle.
Bash Code:
#!/bin/bash
# Version 1.0.0

shuffle_url=$1
api_token=$2
offense_id=${3%.*}

auth_header="SAuthorization:$api_token"

output=$(curl --insecure -H "$auth_header" "$shuffle_url" -d "$(cat <<EOF
{ "offense_id": $offense_id }
EOF
)")
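One detail worth noting: the `${3%.*}` expansion strips a trailing decimal suffix from the offense ID argument, since QRadar may pass a value like `123.0`. For example:

```shell
# Mirror the ${3%.*} expansion from the script: strip everything after the last dot
offense_id="123.0"
echo "${offense_id%.*}"
```

This prints `123`, the integer ID the workflow then uses for the get offense action.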
What you should do to use it:
1. Log In to QRadar;
2. Go to Admin > Custom Actions > **Define Actions**;
3. Click **Add**;
4. Fill in the Basic Information (name & description);
5. In the Script Configuration, set **bash** as the interpreter and import the bash script above;
6. **For the Script Parameters, add the parameters in the following order:**
7. Save
8. Deploy Changes


#### STEP 2: Create new offense alert rule
1. Go to Offenses > **Rules**;
2. Click Actions > **New Event Rule**:
Rule Description
Apply Shuffle new offense alert on events which are detected by the Local system
and when the event QID is one of the following (28250369) Offense Created
Rule Responses
Execute Custom Action QRadar to Shuffle (the previously added script)
This Rule will be: Enabled
#### QRadar Overview
- Easy: this way, each time a new offense dispatches in QRadar, it sends a webhook to Shuffle containing the offense_id. The webhook node receives this info and passes it to the next node (the QRadar app), which performs a **get offense** action using the received offense ID as the key.
This way you won't need to execute an API call every X minutes and save the last offense ID. It's a better solution and improves the SLA: with a 1-minute polling interval you may see up to a minute of delay, whereas the webhook fires immediately, and Shuffle can handle multiple offenses at the same time if more than one dispatches in QRadar at once or within seconds of each other.
### FortiSIEM
FortiSIEM is Fortinet's SIEM. It can notify Shuffle through a webhook when a rule triggers, which is exactly what this documentation section is for. The main caveat: all data is XML and needs to be transformed with the Shuffle Tools "XML to JSON" formatter.
**1. Create a Workflow which will receive alerts**
This one is pretty easily explained. Go to Shuffle and make a new Workflow.
**2. Add a Webhook to the workflow**
[Add a webhook](/docs/triggers#webhook) and get the Webhook URL. Remember to start the Webhook!

Copy the URL and keep it for the next steps

**3. Configure FortiSIEM forwarding**
Log into the UI. Once inside, go to ADMIN > Settings > Incident Notification. In the "Incident HTTP Notification" field, paste in the webhook URL from the previous step. Click "Save", then "Test". After the test, check whether the workflow triggered in Shuffle. If it didn't, your FortiSIEM can't reach the Shuffle instance.

With the Notification Endpoint specified in the previous step, we need to decide what rules to add. By default, we add all of them. To do this, go to ADMIN > Settings > Notification Policy, and add a new policy. In here, select the "Send XML file over HTTP(S) to the destination set in...". This will make sure all alerts are sent to Shuffle.

That's it! It's now time to wait for an alert to actually trigger. When it has, make sure to send Execution Argument from the webhook ($exec) straight into an XML to JSON parser. That way you can use it easily in Shuffle.
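As an illustration of what the XML-to-JSON step does, here is a minimal sketch using Python's standard library (the sample incident XML and its tag names are hypothetical, not FortiSIEM's real schema):

```python
import json
import xml.etree.ElementTree as ET

def xml_to_dict(element):
    """Recursively convert an XML element tree into a plain dict."""
    children = list(element)
    if not children:
        return element.text
    return {child.tag: xml_to_dict(child) for child in children}

# Hypothetical incident payload - FortiSIEM's real XML is richer
sample = "<incident><ruleName>Excessive Logins</ruleName><severity>9</severity></incident>"
parsed = xml_to_dict(ET.fromstring(sample))
print(json.dumps(parsed))
```

Note the sketch keeps only the first element per tag name; the Shuffle Tools formatter also handles repeated tags and attributes.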
### Splunk SIEM
Splunk is a SIEM tool for security operations. There are multiple ways to forward Splunk alerts to external systems; the simplest way to forward them to Shuffle is using a webhook.
**Step 1:** First we'll have to create a Shuffle workflow which will receive alerts from Splunk. Go to /workflows in Shuffle and create a new workflow. Then, inside the workflow editor, drag in the webhook from the trigger section in the left pane.

**Step 2:** Click on the webhook node and then click on start. Copy the webhook url.

**Now that we have webhook running in the Shuffle, Go to your Splunk deployment server and log in.**
**Step 3:** Now we'll have to configure Splunk with the webhook URL. Once logged in, go to the **search and reporting** app and type in the query you want to create an alert for.

**Step 4:** Save the search query as an alert.

**Step 5:** Fill out all the form details for saving as an alert. At the very bottom of the form, in the **Trigger actions** section, click **Add actions** and select Webhook.

**Step 6:** Paste in the webhook URL in URL field and click Save.

**You should now start seeing data sent from Splunk into Shuffle which can be used inside workflow for further actions.**
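For reference, Splunk's webhook alert action POSTs a JSON body along these lines (field values are hypothetical; the exact shape is documented by Splunk):

```json
{
  "search_name": "Failed logins over threshold",
  "sid": "scheduler__admin__search__RMD5...",
  "result": {
    "sourcetype": "linux_secure",
    "count": "42"
  },
  "results_link": "http://splunk.example.com:8000/app/search/search?q=..."
}
```

Inside the workflow, fields like the search name can then be referenced from the webhook's execution argument, e.g. $exec.search_name.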
### Eventlog Analyzer
Note: API integration is unfortunately not supported with EventLog Analyzer. However, you can forward logs from EventLog Analyzer to Shuffle using its log forwarder. For more information, please follow this [guide](https://www.manageengine.com/products/eventlog/help/StandaloneManagedServer-UserGuide/Configurations/log-forwarder.html).
### ServicePilot SIEM
ServicePilot is a high-performance analytics platform that supports observability and full-stack monitoring: metrics, traces and logs. You can collect data from many services and sources across your entire IT stack (ITIM, NPM, APM, DEM, SIEM), as well as view details of historical data stored by ServicePilot. A webhook integration service is not provided by the ServicePilot platform, so below are the steps for using the ServicePilot app in a workflow instead.
#### - What does this workflow do?
This workflow retrieves all alerts, objects, events and logs. It uses two ServicePilot app nodes: one gets all the records from the ServicePilot SIEM app, and the other searches alerts, objects, events and logs for a specific entry. The workflow first executes the Get_All_Data node, then the Get_Specific_Record node, and finally returns the result.
#### - How does the searching work?
To search records in the ServicePilot app, we write an SQL query as a query parameter in the request URL.
#### - Steps to create a workflow
**Step 1:** First we'll have to create a Shuffle workflow which will search for events, objects, alerts, and logs from ServicePilot. Go to /workflows in Shuffle and create a new workflow. Then, inside the workflow editor, drag in the ServicePilot node.

**Step 2:** Click on the ServicePilot node.

**Step 3:** Rename the node to Get_All_Data.

**Step 4:** Write a query to fetch all the data from ServicePilot.

**Step 5:** Inside the workflow editor, drag in another ServicePilot node and connect it to the existing node.

**Step 6:** Rename the node to Get_Specific_Record.

**Step 7:** Write a query to search for a specific record in ServicePilot.

**Finally, click the execution button and you should start seeing data sent from ServicePilot into Shuffle, which can be used inside the workflow for further actions.**
### ELK
TBD: Kibana forwarding & ElastAlert
### Cortex
TBD: Responder executions