Step By Step Implementation Guide
Data Sources Implementation
Get credentials
- Request access to, or find credentials for, the data source in the 1Password vault dedicated to the client
- Request the URL and password for the monitoring account of the client's instance from #help_it
- Confirm that the credentials are valid as soon as you receive them
- Confirm that data source access matches the defined scope of work (we have access/credentials to all data sources in scope, and none that are out of scope)
- Request a sample of data (whether or not access is already available); this should give the analyst a good overview of what to expect
Initial data exploration
- Prepare a data sample description following the Data Exploration checklist, paying attention to (see the SQL sketch at the end of this section):
- List of fields
- How sparse the data is
- Data values sample (SELECT DISTINCT something)
- Distribution
- Volume
- Prepare a customer events description that includes the following (also refer to How to Format Events):
- Types of events (e.g. purchase, shipment, delivery, cancellation)
- Example of payload
- Prepare a list of unique IDs from all events
- Compare this against the identity graph defined with the client. For each data source's events, we should see the same types of unique IDs
- Check if any data validation of the unique IDs is necessary
- Check the format of PS identifiers across all data sources:
- "None", "0", "NULL" in text format can result in stitching huge customer entities
- Is the format of the identifier the same across all data sources? "12345" and "12345.0" are different values for PS algorithm
- Are there placeholder values which should be removed?
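A minimal SQL sketch of the exploration and identifier checks above, assuming a raw events table; the table and column names (raw_events, customer_id, event_type) are illustrative and will differ per data source.

```sql
-- Volume and sparsity: total rows and how often the stitching identifier is filled
SELECT
    COUNT(*)                                        AS total_rows,
    COUNT(customer_id)                              AS rows_with_customer_id,
    ROUND(100.0 * COUNT(customer_id) / COUNT(*), 2) AS customer_id_fill_pct
FROM raw_events;

-- Data values sample and distribution of event types
SELECT event_type, COUNT(*) AS events
FROM raw_events
GROUP BY event_type
ORDER BY events DESC;

-- Suspicious PS identifier values: text placeholders and inconsistent formats
SELECT customer_id, COUNT(*) AS occurrences
FROM raw_events
WHERE TRIM(customer_id) IN ('None', 'NULL', 'null', '0', '')
   OR customer_id LIKE '%.0'   -- "12345.0" vs "12345" are different values for PS
GROUP BY customer_id
ORDER BY occurrences DESC;
```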
Data source implementation
Implement the workspace:
- Extract
- Clean (exclude events we don’t need, remove unnecessary fields, clean the values); see the sketch after this list
- Validate the Profile Stitching IDs
- Confirm the valid version/branch of format_events and cdp_db_loader components
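A hedged sketch of what the clean/validate step can look like as a single SQL transformation; the event types, field names, and the events_cleaned output table are placeholders, not the actual component configuration.

```sql
-- Keep only in-scope events, drop unneeded fields, and normalise the PS identifier
CREATE TABLE events_cleaned AS
SELECT
    event_id,
    event_time,
    event_type,
    CASE
        WHEN TRIM(customer_id) IN ('', '0', 'None', 'NULL') THEN NULL  -- placeholder values
        ELSE REGEXP_REPLACE(TRIM(customer_id), '\.0$', '')             -- "12345.0" -> "12345"
    END AS customer_id,
    payload
FROM raw_events
WHERE event_type IN ('purchase', 'shipment', 'delivery', 'cancellation');  -- in-scope events only
```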
Document the workspace:
- General description and business assumptions of the workspace
- Comments in configs
Please refer to the following documentation:
- Template for Workspace Documentation
- Meiro Events: How to Implement Meiro Events in MI and CDP and set up alerts
- How to Format events into format suitable for customer_events table
- How to Load events into CDP
Review, Finalize
- Code review: assign a senior analyst to do the code review
- Final checks:
Double-check through queries: benchmark against source data (e.g. validate total web visits against GA numbers, or total email subscribers against the source system). This is a quick way to identify significant gaps in our data ingestion; see the query sketched below.
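One possible shape of such a benchmark query, assuming web visits are loaded into the customer_events table under an illustrative type value; compare the result against the GA report for the same period.

```sql
-- Monthly web visits loaded into the CDP, to compare against Google Analytics
SELECT
    DATE_TRUNC('month', event_time) AS month,
    COUNT(*)                        AS web_visits_in_cdp
FROM customer_events
WHERE type = 'web_visit'   -- illustrative event type name
GROUP BY 1
ORDER BY 1;
```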
Profile Stitching
Define profile stitching configuration
Run PS workspace
Check the quality of Profile Stitching after running.
In each project there is always the potential for edge cases where we wrongly stitch some customer profiles or create super entities (see the query sketch below). Refer to this guide for cleaning wrongly stitched customer entities without re-running PS, which can be a time- and resource-consuming process.
If the situation calls for it, you can re-define the PS configuration and re-run profile stitching from scratch.
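A quick way to spot suspected super entities is to list the largest stitched entities. This is only a sketch; the column names (customer_entity_id, user_identifier) are assumptions and may differ in your instance.

```sql
-- Largest stitched entities: thousands of source identifiers or events on one
-- profile usually point to a shared placeholder value being stitched together
SELECT
    customer_entity_id,
    COUNT(*)                        AS events,
    COUNT(DISTINCT user_identifier) AS source_identifiers
FROM customer_events
GROUP BY customer_entity_id
ORDER BY source_identifiers DESC
LIMIT 50;
```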
Attributes
In the business requirement gathering process, the PM should have defined a list of required attributes from the client following this Attributes Library template.
Based on this list, analysts should first verify that the data or events needed for all requested attributes are available and that the attributes can be calculated. Double-check the formula we should be using as well.
Set first & last name as the first two attributes in the customer profile
Implement SQL queries for calculation & run CPS: Attributes Calculation configuration
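A minimal example of what one attribute calculation query can look like, here for a hypothetical "last purchase date" attribute; the event type and column names are assumptions and must match the actual events in the project.

```sql
-- Last purchase date per stitched customer entity
SELECT
    customer_entity_id,
    MAX(event_time) AS last_purchase_date
FROM customer_events
WHERE type = 'purchase'   -- assumed event type name
GROUP BY customer_entity_id;
```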
Data Destination
- Before configuring data destinations, ensure that you have the relevant access/credentials to the destination in 1Password
- OAuth repo
- Prepare OAuth repo, if relevant
- Confirm that the OAuth repo is ready and the loader component can be authorized
- Implementation
- Set up the connection between MI and CDP (create admin “users” for API connection, test)
- Check the requirements for the data in the destination; depending on the destination, certain IDs need to be set as exported attributes (see the check sketched at the end of this section)
- Prepare exported attributes
- Set the export destination in MI and test the workspace
- Set up alerts for: MI workspace and Meiro Events monitoring dashboard. This is crucial for making sure that we are always aware of failed workspaces and implementation issues before the impact is noticed by the end user of the CDP.
- The analyst team will take turns to be on duty monitoring the alerts
- Test segment export with users: send a non-empty segment to the customer and confirm that the end destination receives the export.
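Before the export test, it can help to confirm that the identifier the destination requires is actually populated as an attribute. A sketch assuming a customer_attributes table and a destination that requires an email; both names are illustrative.

```sql
-- Share of profiles missing the identifier required by the destination (e.g. email)
SELECT
    COUNT(*) FILTER (WHERE email IS NULL OR email = '') AS profiles_without_email,
    COUNT(*)                                            AS profiles_total
FROM customer_attributes;
```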
Final CDP and BE Settings
Project Managers need to set up the Business Explorer instance
Welcome Emails & Product Newsletters
UAT