Relativity Implementation: Smoke Testing
To confirm that our Relativity instance was functioning properly after the initial installation, our team performed a ‘smoke test’ following the procedures documented in Relativity's Post-Installation Verification Test document, using the sample data set in the downloadable Post-Installation Verification Test Data.zip. This 2-hour post-installation verification test covers common tasks such as creating workspaces, working with the viewer, and importing data. Note that a smoke test is not a performance test: it uses a small sampling of documents, simple searches, and imaging sets to confirm basic functional operation. Performance testing by our QA team in the production environment is documented in our next blog.
Relativity recommends performing a smoke test after the initial installation and after any upgrade to verify that the installation is fully functional. The team performed the test procedures for each Relativity module in the specified sequence with the provided sample data, so that the results for processing, analytics, and saved searches should be identical across functioning environments. This makes it easier to troubleshoot any unexpected results.
Before Starting
The team checked the following before starting the smoke test:
- Reviewed “What’s New” on the Infrastructure page of the Relativity help site
- Checked all certifications on upgraded or newly added servers (see the certificate-check sketch after this list)
- Clicked into loaded documents to check that both the viewer and extracted text are working
- Viewed the Relativity-provided Post-Installation Verification Test Tips video for guidance on field setup, identifying dependencies, and related topics.
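If the certification check above includes verifying TLS certificates on upgraded or newly added web servers (an assumption on our part; your pre-checks may differ), a short script like the following can confirm that each server presents a valid, unexpired certificate before the smoke test begins. The host names are placeholders.

```python
# Hedged pre-check sketch: confirm each Relativity web server presents a valid,
# unexpired TLS certificate before the smoke test starts. Host names are placeholders.
import socket
import ssl
import time

SERVERS = ["relativity-web-01.example.local", "relativity-web-02.example.local"]

def days_until_expiry(host: str, port: int = 443) -> int:
    """Connect to the host, read its certificate, and return days until expiry."""
    context = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    expires_epoch = ssl.cert_time_to_seconds(cert["notAfter"])
    return int((expires_epoch - time.time()) // 86400)

if __name__ == "__main__":
    for server in SERVERS:
        try:
            remaining = days_until_expiry(server)
            status = "OK" if remaining > 30 else "EXPIRING SOON"
            print(f"{server}: {remaining} days remaining ({status})")
        except (ssl.SSLError, OSError) as exc:
            print(f"{server}: certificate check failed ({exc})")
```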
Field Setup
Many sections of the smoke test require creating fields. To avoid working through a module and then jumping back and forth to create each smoke field, our team followed the recommendation to create all of the fields, along with their choices, views, and saved searches, at one time. We created all of the provided fields before importing any documents, which also made it easier to keep track of the multiple fields that differ from one another only by minor changes. The fields to be created included Smoke OCR Destination, Smoke Example, Smoke Object, and Email Thread Group.
After creating the smoke workspace, the team created the new fields, specifying properties such as Object Type, Name, Field Type, Required, Import Behavior, Pane Icon, Relational View, Allow Group By, and Allow Pivot.
We downloaded the Post-Installation Verification Test Data.zip file and created each field with all of its choices before smoke testing. The zip file contains a data set of over 2,500 files for running all of the smoke tests. The team ran the specified searches and processing sets and checked the results for consistency with the expected output.
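We created these fields through the front end, but the same repetitive setup could in principle be scripted against Relativity's REST services. The sketch below is illustrative only: the endpoint path, payload shape, credentials, and field types are assumptions, not the documented API, and would need to be adapted to the REST reference for your instance's version.

```python
# Illustrative only: scripting the repetitive "Smoke" field creation.
# The endpoint path, payload shape, and field types below are placeholders,
# not the documented Relativity API; adapt them to your instance's REST reference.
import requests
from requests.auth import HTTPBasicAuth

BASE_URL = "https://relativity.example.local/Relativity.REST/api"  # placeholder host
WORKSPACE_ID = 1234567  # placeholder smoke-test workspace artifact ID
AUTH = HTTPBasicAuth("serviceaccount@example.local", "password")   # placeholder credentials

# Field definitions mirroring the names we set up in the UI; types are assumptions.
SMOKE_FIELDS = [
    {"Name": "Smoke OCR Destination", "FieldType": "Long Text"},
    {"Name": "Smoke Example",         "FieldType": "Single Choice"},
    {"Name": "Smoke Object",          "FieldType": "Single Object"},
    {"Name": "Email Thread Group",    "FieldType": "Fixed-Length Text"},
]

def create_field(field: dict) -> None:
    """POST one field definition to a placeholder field-creation endpoint."""
    url = f"{BASE_URL}/placeholder-field-manager/workspaces/{WORKSPACE_ID}/fields"
    response = requests.post(url, json=field, auth=AUTH, timeout=30)
    response.raise_for_status()
    print(f"Created field: {field['Name']}")

if __name__ == "__main__":
    for field in SMOKE_FIELDS:
        create_field(field)
```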
Identifying Dependencies
Different applications may share required fields, layouts, searches, and indexes whose interdependencies are not explicitly tested but need to be understood for troubleshooting.
Parallelization
Multi-screen views enabled our team to work more efficiently on different parts of the smoke test while waiting on index builds, dtSearch queries, analytics jobs, processing sets, and running productions.
Executing the Smoke Test
Our team executed all smoke test procedures documented in the Post-Installation Verification Test document including the following:
- Created workspace elements: a new field, new choices, a layout, and a markup set
- Created a new search and imaging profile
- Performed keyword searches, created a new dtSearch index and verified that it built successfully, and created and ran a production, a saved search, and a new production set (a sketch for checking search results against expected counts follows this list)
- Staged, ran, and exported the production, then verified that it deleted properly
- Created a new OCR set
- Used Relativity Analytics: installed Analytics, modified the profile, created an index, used Compare, created a new cluster, and created an active learning project
- Created and tested a view to display email threading
- Installed and ran Assisted Review
- Created a new batch set
- Created and ran a search terms report
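A recurring theme in these steps is that each search or job should return the same results in any healthy environment. One lightweight way to capture that, sketched below, is to record the expected document counts from the verification document and compare them against what each smoke search actually returned. The search names and counts here are placeholders, not values from the Relativity test data.

```python
# Sketch: compare smoke-test search results against the expected counts
# recorded from the Post-Installation Verification Test document.
# All names and counts below are placeholders, not actual expected values.

EXPECTED_COUNTS = {
    "Keyword search - term A": 120,
    "dtSearch - term B": 45,
    "Saved search - production candidates": 300,
}

def check_results(actual_counts: dict) -> bool:
    """Return True if every smoke search matched its expected document count."""
    all_match = True
    for name, expected in EXPECTED_COUNTS.items():
        actual = actual_counts.get(name)
        if actual != expected:
            print(f"MISMATCH: {name}: expected {expected}, got {actual}")
            all_match = False
        else:
            print(f"OK: {name}: {actual} documents")
    return all_match

if __name__ == "__main__":
    # Actual counts would come from the Relativity UI or an API query.
    observed = {
        "Keyword search - term A": 120,
        "dtSearch - term B": 44,   # deliberately off by one to show a mismatch
        "Saved search - production candidates": 300,
    }
    print("Smoke searches consistent:", check_results(observed))
```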
Our team ran the telemetry smoke test script to verify that system usage metrics can be transmitted back to Relativity. The script creates and queries a test metric; if telemetry is configured properly, the metric is transmitted and purged, and the query returns no results. For additional information, see Relativity's Telemetry and metrics documentation.
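The telemetry script itself is provided by Relativity, so we will not reproduce it here, but the verification logic it performs boils down to: write a test metric, query for it after a short interval, and treat an empty result set as success (the metric was transmitted and purged). The sketch below captures only that logic; the helper functions and the wait interval are stand-ins, not the actual script.

```python
# Conceptual sketch of the telemetry verification logic described above: create
# a test metric, wait, query for it, and treat an empty result as success.
# The two helper functions are stand-ins for whatever the Relativity-provided
# script actually calls; here they simply simulate a correctly configured system.
import time

def submit_test_metric(name: str) -> None:
    # Stand-in: the real script writes a metric for the telemetry pipeline.
    print(f"Submitted test metric '{name}'")

def query_test_metric(name: str) -> list:
    # Stand-in: the real script queries metric storage. An empty list means the
    # metric was transmitted back to Relativity and purged locally.
    return []

def telemetry_configured_correctly(metric_name: str = "SmokeTest.Telemetry.Check") -> bool:
    submit_test_metric(metric_name)
    time.sleep(5)  # allow time for transmission and purge (interval is an assumption)
    return len(query_test_metric(metric_name)) == 0

if __name__ == "__main__":
    print("Telemetry OK:", telemetry_configured_correctly())
```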
Next Blog: Performance Testing for Production
Be sure to check out our next blog, where our QA team tested performance metrics for searching, security, and deployment, along with compliance with the stages of the Electronic Discovery Reference Model (EDRM).