Dativa Pipeline API: Working with session data

Session data is common in IoT applications, so the Pipeline API provides a dedicated rule for validating it, removing overlapping sessions and filling any gaps. Setting rule_type="Session" enables the following parameters (an illustrative configuration follows the list):

  • key_field - specifies the field that keys the session. For most IoT applications this would be the device ID. If two sessions overlap on the same value of key_field, they will be truncated. Defaults to None
  • start_field - specifies the field that controls the start of the session. Defaults to None
  • end_field - specifies the field that controls the end of the session. Defaults to None
  • date_format - defaults to '%Y-%m-%d %H:%M:%S'
  • overlaps_option - specifies how overlaps should be handled:
    • "ignore" - overlaps are not processed
    • "truncate_start" - the overlap is resolved by truncating the start of the next session
    • "truncate_end" - the overlap is resolved by truncating the end of the previous session
  • gaps_option - specifies how gaps should be handled:
    • "ignore" - gaps are ignored
    • "extend_start" - the gaps are resolved by extending the start of the next session
    • "extend_end" - the gaps are resolved by extending the end of the previous session
    • "insert_new" - the gaps are resolved by inserting a new session as specified in the "template_for_new" parameter
  • template_for_new - contains a comma-separated list of values that will be used as a template for a new row in the file to fill the gap. The key_field, start_field, and end_field will be replaced with appropriate values to fill any gaps.
  • allowed_gap_seconds - specifies how many seconds of a gap are allowed before the gap options are implemented, defaults to 1
  • allowed_overlap_seconds - specifies how many seconds of overlap are allowed before the overlap options are implemented, defaults to 1
  • remove_zero_length - specifies whether zero-length sessions should be removed. Defaults to True
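
As a rough illustration, the sketch below builds a hypothetical rule_type="Session" configuration using the parameters above and applies a simplified, standalone version of the "truncate_end" and "insert_new" behaviour to toy data. The "params" wrapper, the field names device_id, session_start, and session_end, and the clean_sessions helper are assumptions made for illustration only; they are not the exact schema or implementation of the Pipeline API or its Python client.

    from datetime import datetime, timedelta

    DATE_FORMAT = "%Y-%m-%d %H:%M:%S"

    # Hypothetical Session rule configuration mirroring the parameters above.
    # The surrounding structure is an assumption, not the exact API schema.
    session_rule = {
        "rule_type": "Session",
        "params": {
            "key_field": "device_id",
            "start_field": "session_start",
            "end_field": "session_end",
            "date_format": DATE_FORMAT,
            "overlaps_option": "truncate_end",  # shorten the previous session
            "gaps_option": "insert_new",        # fill gaps with a templated row
            "template_for_new": "unknown,,",    # key/start/end are filled in by the rule
            "allowed_gap_seconds": 60,
            "allowed_overlap_seconds": 1,
            "remove_zero_length": True,
        },
    }

    def clean_sessions(rows, params):
        """Simplified sketch of the rule's logic for a single key value.

        rows: list of (start, end) datetime tuples, sorted by start time.
        Returns a new list with overlaps truncated ("truncate_end") and gaps
        filled ("insert_new"), as described above. In the real rule the
        inserted row's other columns would come from template_for_new.
        """
        max_overlap = timedelta(seconds=params["allowed_overlap_seconds"])
        max_gap = timedelta(seconds=params["allowed_gap_seconds"])
        cleaned = []

        for start, end in rows:
            if cleaned:
                prev_start, prev_end = cleaned[-1]
                if prev_end - start > max_overlap:
                    # Overlap: truncate the end of the previous session.
                    cleaned[-1] = (prev_start, start)
                elif start - prev_end > max_gap:
                    # Gap: insert a new session covering the missing interval.
                    cleaned.append((prev_end, start))
            cleaned.append((start, end))

        if params["remove_zero_length"]:
            cleaned = [(s, e) for s, e in cleaned if e > s]
        return cleaned

    if __name__ == "__main__":
        def parse(s):
            return datetime.strptime(s, DATE_FORMAT)

        sessions = [
            (parse("2018-01-01 10:00:00"), parse("2018-01-01 10:30:00")),
            (parse("2018-01-01 10:20:00"), parse("2018-01-01 11:00:00")),  # overlaps previous
            (parse("2018-01-01 11:30:00"), parse("2018-01-01 12:00:00")),  # gap before this one
        ]
        for s, e in clean_sessions(sessions, session_rule["params"]):
            print(s, "->", e)

Running the sketch prints four sessions: the first is truncated at 10:20 where the second begins, and a new session is inserted to cover the 30-minute gap before the last one.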

Related documentation

  • Dativa Pipeline API on AWS - The Dativa Pipeline API is available through the AWS marketplace (more)
  • Dativa Pipeline Python Client - Dativa Tools includes a client for the Pipeline API (more)
  • Dativa Pipeline API: Sample Data - Sample files to demonstrate usage of the Dativa Pipeline API (more)
  • Dativa Pipeline API: Validating basic data types - Validating incoming datasets for basic string, number, and date type formatting and range checks using the Dativa Data Pipeline API (more)
  • Dativa Pipeline API: Anonymizing data - The Dativa Pipeline API supports tokenization, hashing, and encryption of incoming datasets for anonymisation and pseudonymization (more)
  • Dativa Pipeline API: Referential Integrity - Using the Dativa Pipeline API to validate data against other known good datasets to ensure referential integrity (more)
  • Dativa Pipeline API: Handling invalid data - Invalid data can be quarantined or automatically fixed by the Dativa Data Pipeline API (more)
  • Dativa Pipeline API: Reporting and monitoring data quality - The Dativa Pipeline API logs data that does not meet the defined rules and quarantines bad data (more)
  • Dativa Pipeline API: Full API reference - A field by field breakdown of the full functionality of the Dativa Data Pipeline API (more)
