A Data Hook® is a short snippet of code that allows you to validate, manipulate, and format data as it is being uploaded.
Data Hooks are built with our Platform SDK and can take the form of field hooks (which run on a specific field) or record hooks (which run on an entire record). For an overview of how to build Data Hooks, see our developer documentation, and for examples of Data Hooks and Sheets, see the examples section of our Platform SDK template.
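To make the distinction concrete, here is a minimal sketch of the two shapes. These types and function names are illustrative assumptions, not the Platform SDK's actual API: a field hook sees one field's value, while a record hook sees the whole record.

```typescript
// Hypothetical shapes for illustration only -- not the actual Platform SDK API.
type FieldHook<T> = (value: T) => T;
type RecordHook = (record: Record<string, unknown>) => Record<string, unknown>;

// Field hook: normalize a single field's value (here, an email address).
const normalizeEmail: FieldHook<string> = (value) => value.trim().toLowerCase();

// Record hook: derive a new field from other fields on the same record.
const addFullName: RecordHook = (record) => ({
  ...record,
  fullName: `${record.firstName} ${record.lastName}`,
});

const input = { firstName: "Ada", lastName: "Lovelace", email: "  ADA@EXAMPLE.COM " };
const withEmail = { ...input, email: normalizeEmail(input.email) };
const output = addFullName(withEmail);
console.log(output.email);    // "ada@example.com"
console.log(output.fullName); // "Ada Lovelace"
```

The split matters in practice: field hooks keep per-value cleanup reusable across Sheets, while record hooks handle logic that needs to see several fields at once.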
All Data Hooks built in the Platform SDK for Portal 3.0 and Workspaces run in a fixed order: cast, then compute, then recordCompute, then batchRecordsCompute, then validate. Any hook that relies on the output of another hook should be written with this order in mind.
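The ordering above can be sketched as a simple pipeline. This is a simulation with plain functions, not the SDK itself, and the field names (name, age, isAdult) are made up for illustration; the point is that each stage can safely rely on the stages before it, so recordCompute below can assume cast has already coerced age to a number.

```typescript
type Rec = Record<string, unknown>;

// 1. cast: coerce raw values to the right types.
const cast = (raw: Rec): Rec => ({ ...raw, age: Number(raw.age) });
// 2. compute: per-field transforms.
const compute = (rec: Rec): Rec => ({ ...rec, name: String(rec.name).trim() });
// 3. recordCompute: derive fields from the whole record (age is a number by now).
const recordCompute = (rec: Rec): Rec => ({ ...rec, isAdult: (rec.age as number) >= 18 });
// 4. batchRecordsCompute: batch-level work across all records (no-op here).
const batchRecordsCompute = (recs: Rec[]): Rec[] => recs;
// 5. validate: runs last, so it sees fully cast and computed data.
const validate = (rec: Rec): string[] =>
  (rec.age as number) < 0 ? ["age must be non-negative"] : [];

const rows: Rec[] = [{ name: " Ada ", age: "36" }];
const processed = batchRecordsCompute(rows.map((r) => recordCompute(compute(cast(r)))));
const errors = processed.map(validate);
console.log(processed[0]); // { name: "Ada", age: 36, isAdult: true }
console.log(errors[0]);    // []
```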
We highly recommend using the testing utility provided with the Flatfile Platform SDK to ensure your hooks behave as expected before deploying your Sheets to production. Additional testing can be done after your Sheets are deployed, but testing in code first lets you quickly confirm that your Data Hook logic works as expected.
In Portal 3.0 and Workspaces, all Data Hooks run after the mapping step and before the review stage, so keep their place in the data import workflow in mind as you build them.