How We Use InfluxDB for Security Monitoring

Our inventory of security events at InfluxData

Patterns we’re looking for

  • Total number of unique accounts
  • Total number of authentication attempts
  • Total number of successful authentication attempts
  • Total number of unsuccessful authentication attempts
  • Average number of IP addresses per account
  • Average number of accounts per IP address
  • List of authentication events, with time, username, app, IP address, and whether successful or not

Authentication events

Data collection

Data storage

Data model

  • Authentication timestamp
  • Company account ID
  • Username
  • User ID
  • User domain
  • Authentication type
  • Authentication result
In InfluxDB, these map to the following keys, populated from each G Suite login event (abbreviated GWs below):

  • time: GWs.id.time
  • service_source: “G Suite”
  • service_domain: “influxdata.com”
  • source_address: GWs.ipAddress
  • email_address: GWs.actor.email
  • saas_account_id: GWs.actor.profileId
  • customer_id: GWs.id.customerId
  • application: GWs.id.applicationName
  • auth_result: GWs.events[X].name
  • login_type: GWs.events[X].parameters[Y].value
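
To make the mapping concrete, here is a hypothetical line protocol point for one login event. The measurement name auth_activity comes from the queries below; the tag/field split is an assumption based on those queries (auth_result is treated as a field, and login_type is shown as a second field, though it could equally be a tag), and all values are invented:

auth_activity,service_source=G\ Suite,service_domain=influxdata.com,email_address=alice@influxdata.com,saas_account_id=1234567890,customer_id=C0123abcd,source_address=203.0.113.42,application=login auth_result="login_success",login_type="google_password" 1650000000000000000

Stored this way, counting logins, failures, or unique accounts reduces to filtering and grouping on these columns, which is exactly what the dashboard queries below do.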

Visualization

Dashboard elements

Unique accounts

from(bucket: v.bucket)
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) =>
      r._measurement == "auth_activity"
      and r._field == "auth_result"
  )
  |> keep(columns: ["email_address"])
  |> group()
  |> unique(column: "email_address")
  |> count(column: "email_address")

Authentication attempts

from(bucket: v.bucket)
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) =>
      r._measurement == "auth_activity"
      and r._field == "auth_result"
      and (r._value == "login_success" or r._value == "login_failure")
  )
  |> keep(columns: ["_time", "email_address"])
  |> group()
  |> count(column: "email_address")

Successful authentications

from(bucket: v.bucket)
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) =>
      r._measurement == "auth_activity"
      and r._value == "login_success"
  )
  |> group()
  |> count()

Failed authentications

from(bucket: v.bucket)
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) =>
      r._measurement == "auth_activity"
      and r._value == "login_failure"
  )
  |> group()
  |> count()

Average address cardinality per account

from(bucket: v.bucket)
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) =>
      r._measurement == "auth_activity"
      and r._field == "auth_result"
  )
  |> keep(columns: ["email_address", "source_address"])
  |> group(columns: ["email_address"])
  |> unique(column: "source_address")
  |> count(column: "source_address")
  |> group()
  |> mean(column: "source_address")

Total account cardinality per address

// Count the distinct source addresses seen in the time range.
addresses = from(bucket: v.bucket)
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) =>
      r._measurement == "auth_activity"
      and r._field == "auth_result"
  )
  |> keep(columns: ["source_address"])
  |> map(fn: (r) => ({ r with field: "x1" }))  // synthetic join key
  |> group(columns: ["field"])
  |> rename(columns: {source_address: "_value"})
  |> unique()
  |> count()

// Count the distinct accounts seen in the same range.
accounts = from(bucket: v.bucket)
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) =>
      r._measurement == "auth_activity"
      and r._field == "auth_result"
  )
  |> keep(columns: ["email_address"])
  |> map(fn: (r) => ({ r with field: "x1" }))  // synthetic join key
  |> group(columns: ["field"])
  |> rename(columns: {email_address: "_value"})
  |> unique()
  |> count()

// Join the two single-row tables on the synthetic "field" column, then
// divide accounts by addresses to get accounts per address.
join(tables: { d1: addresses, d2: accounts }, on: ["field"])
  |> map(fn: (r) => ({
      r with _value: float(v: r._value_d2) / float(v: r._value_d1)
  }))
  |> keep(columns: ["_value"])

Authentication results

from(bucket: v.bucket)
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) =>
      r._measurement == "auth_activity"
      and r._field == "auth_result"
  )
  |> keep(columns: ["_start", "_stop", "_time", "_value"])
  |> map(fn: (r) => ({ r with res: r._value }))
  |> group(columns: ["res"])
  |> aggregateWindow(every: v.windowPeriod, fn: count)

Latest authentication events

from(bucket: v.bucket)
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) =>
      r._measurement == "auth_activity"
      and r._field == "auth_result"
  )
  |> duplicate(column: "_value", as: "auth_result")
  |> drop(columns: [
      "_start", "_stop", "_field", "_measurement", "application",
      "customer_id", "service_source", "saas_account_id", "_value",
      "service_domain"
  ])
  |> group()
  |> sort(columns: ["_time"], desc: true)

A request to our fellow cloud software vendors

  • Access: Who logged in or attempted to, at what time, and from where, in the form of an IP address or fully-qualified domain name (FQDN). Even better: provide the latitude and longitude of each login, so a customer can compute the distance between login sessions and spot a likely account compromise (see the sketch after this list).
  • Usage: How long someone’s session lasted.
  • Activity: This is domain-specific and should allow tracking of at least add, change, and delete operations in an application or cloud service.
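
On the latitude/longitude point: as a minimal sketch, assuming (hypothetically) that a vendor exposed lat and lon with each login event, a haversine helper in Flux could estimate the distance between two sessions. None of the names below come from a real API.

import "math"

// Hypothetical helper: great-circle (haversine) distance in kilometers
// between two latitude/longitude pairs given in degrees.
haversine = (lat1, lon1, lat2, lon2) => {
    rad = math.pi / 180.0
    dLat = (lat2 - lat1) * rad
    dLon = (lon2 - lon1) * rad
    a = (math.sin(x: dLat / 2.0) * math.sin(x: dLat / 2.0)
        + math.cos(x: lat1 * rad) * math.cos(x: lat2 * rad) * math.sin(x: dLon / 2.0) * math.sin(x: dLon / 2.0))

    return 6371.0 * 2.0 * math.asin(x: math.sqrt(x: a))
}

// Example: San Francisco to London is roughly 8,600 km, so two logins a few
// minutes apart from these coordinates would be a strong compromise signal.
// haversine(lat1: 37.77, lon1: -122.42, lat2: 51.51, lon2: -0.13)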

More on InfluxData and security

Conclusion
