Relativity Certification & Implementation: Smoke Testing

To confirm that our Relativity instance was functioning properly after the initial installation, our team performed a ‘smoke test’ following the procedures documented in Relativity's Post-Installation Verification Test document, using the sample data set in the downloadable Post-Installation Verification Test Data.zip.  This two-hour post-installation verification test covers common tasks such as creating workspaces, working with the viewer, and importing data.  Note that a smoke test is not a performance test: it uses a small sample of documents, simple searches, and imaging sets to confirm basic functional operation.  Performance testing by our QA team in the production environment is documented in our next blog. 

Relativity recommends performing a smoke test after the initial installation and after any upgrade to verify that the installation is fully functional.  The team performed the test procedures for each Relativity module in a specified sequence with the provided sample data to ensure that the results for processing, analytics, and saved searches are identical across functioning environments, which facilitates troubleshooting of unexpected results. 

Before Starting 

The team checked the following before starting the smoke test: 

  • Reviewed “What’s New” on the Infrastructure page of the Relativity help site 
  • Checked all certifications on upgraded or newly added servers 
  • Clicked into loaded documents to confirm that both the viewer and extracted text were working 
  • Viewed the Relativity-provided Post-Installation Verification Test Tips video for tips on field setup, identification of dependencies, and related topics 

Field Setup 

Many sections of the smoke test require creating fields.  To avoid working through a module and then going back and forth to create each smoke field on demand, our team followed the recommendation to create all of the fields, along with their choices, views, and saved searches, in a single pass.  Our smoke test workspace contained all of the specified fields before we imported any documents, which made it easy to spot fields that differ only slightly from the test fields.  The fields to be created included Smoke OCR Destination, Smoke Example, Smoke Object, and Email Thread Group. 

After creating the smoke workspace, the team created the new fields, configuring properties including Object Type, Name, Field Type, Required, Import Behavior, Pane Icon, Relational View, Allow Group By, and Allow Pivot. 

We downloaded the Post-Installation Verification Test Data.zip file and created each field with all of its choices before smoke testing. The zip file contains a data set of over 2,500 files for running all of the smoke tests. The team performed the specified searches and processing sets and checked for consistency with the expected results. 
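
Before creating the fields in the UI, it can help to write the planned definitions down as data so the whole set can be reviewed at once. Below is a minimal sketch, where the property values are illustrative assumptions rather than values from the verification test document:

```python
# Planned smoke test field definitions expressed as data for review before
# creation. Property names mirror Relativity's New Field form; the specific
# values chosen here are illustrative assumptions, not values from the
# verification test document.
SMOKE_FIELDS = [
    {"Name": "Smoke OCR Destination", "Object Type": "Document",
     "Field Type": "Long Text", "Required": False},
    {"Name": "Smoke Example", "Object Type": "Document",
     "Field Type": "Single Choice", "Allow Group By": True, "Allow Pivot": True},
    {"Name": "Smoke Object", "Object Type": "Document",
     "Field Type": "Single Object", "Required": False},
    {"Name": "Email Thread Group", "Object Type": "Document",
     "Field Type": "Fixed-Length Text", "Relational View": True},
]

for field in SMOKE_FIELDS:
    print(f"Create field: {field['Name']} ({field['Field Type']})")
```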

Identifying Dependencies 

Different applications may depend on shared fields, layouts, searches, and indexes. These interdependencies are not explicitly tested, but they need to be understood for troubleshooting. 

Parallelization 

Multi-screen views enabled our team to work more efficiently on different parts of the smoke test while waiting on long-running operations such as index creation, dtSearch builds, analytics jobs, processing sets, and productions.  

Executing the Smoke Test 

Our team executed all smoke test procedures documented in the Post-Installation Verification Test document, including the following: 

  • Created workspace elements, including a new field, new choices, a layout, and a markup set 
  • Created a new search and imaging profile 
  • Performed keyword searches; created a new dtSearch index and verified that the build succeeded; created a saved search and a new production set 
  • Staged, ran, and exported the production, then verified that the production deleted properly 
  • Created a new OCR set 
  • Used Relativity Analytics – installed the Analytics application and modified the profile, created an index, used Compare, created a new cluster, and created an active learning project 
  • Created and tested a view to display email threading 
  • Installed and ran Assisted Review 
  • Created a new batch set 
  • Created and ran a search terms report 

Telemetry Smoke Test 

Our team ran the telemetry smoke test script to verify that system usage metrics can be transmitted back to Relativity.  The script creates and queries a test metric; if telemetry is configured properly, the metric is transmitted and purged with no results returned by the query.  For additional information, see Telemetry and metrics. 
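
For illustration only, the sketch below captures the logic of that check: submit a test metric, then confirm a query for it returns nothing. The endpoint, token, and payload shape are placeholders, not Relativity's actual telemetry API:

```python
import requests

# Placeholder endpoint and credential; these are NOT Relativity's actual
# telemetry API, just stand-ins to show the shape of the check.
BASE = "https://relativity.example.com/telemetry"
HEADERS = {"Authorization": "Bearer <token>"}

# Submit a throwaway test metric...
requests.post(f"{BASE}/metrics", json={"name": "smoke.test", "value": 1},
              headers=HEADERS, timeout=30)

# ...then query for it. If telemetry is healthy, the metric has already been
# transmitted upstream and purged, so the query should come back empty.
results = requests.get(f"{BASE}/metrics", params={"name": "smoke.test"},
                       headers=HEADERS, timeout=30).json()
assert results == [], "metric still present: telemetry may be misconfigured"
```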

Next Blog: Performance Testing for Production 

Be sure to check out our next blog, where our QA team tested performance metrics for searching, security, deployment, and compliance with the stages of the Electronic Discovery Reference Model (EDRM).  

Relativity Certification & Implementation: Workflows & Enhancements

Identifying potential areas for automated workflows and enhancements with your stakeholders will give you more time for higher-value tasks, such as using analytical features. Unlock automation within your workspaces by utilizing Relativity's automated workflows, and spend less time on repetitive tasks and busywork. Relativity provides several out-of-the-box options, as well as customized workflows for more advanced users.  

Automated Workflows 

Unlock automated workflows within your workspaces to replace manual processes, such as matter setup.  Use automated workflows to: 

  • Streamline the administrative tasks of a matter by automatically running setup operations. 
  • Automatically update a Search Term Report each time new documents are added to a case. 
  • Achieve a consistent end-user experience by standardizing automated workflows across matters and building them into Relativity workspace templates. 

 

Additional information about Automated Workflows can be found here: https://help.relativity.com/RelativityOne/Content/Relativity/Automated_Workflows.htm 
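
As an illustration of the trigger-and-action pairing these workflows rely on, here is a minimal sketch; the schema, event name, and action types below are assumptions for the example, not Relativity's actual workflow format:

```python
# Illustrative model of a trigger/action automated workflow. The schema,
# event name, and action types are hypothetical; Relativity's real workflow
# definitions live in the Automated Workflows UI, not in code like this.
workflow = {
    "name": "Refresh STR on new documents",
    "trigger": "documents_added",  # fire whenever documents land in the case
    "actions": [
        {"type": "run_search_terms_report", "report": "Case STR"},
        {"type": "notify", "recipients": ["case-admin@example.com"]},
    ],
}

def on_event(event: str) -> None:
    """Run the workflow's actions when its trigger event occurs."""
    if event == workflow["trigger"]:
        for action in workflow["actions"]:
            print(f"Executing action: {action['type']}")

on_event("documents_added")  # -> runs both actions
```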

 

Relativity Scripts 

Use Relativity scripts as another means to customize and enhance Relativity functionality. 

Consult with your stakeholders to identify the Relativity scripts that meet your requirements, either from the full list in the Relativity Script Library or from the most commonly used scripts listed below: 

 

  1. Collect Folder Path Data – Obtain the document count and file sizes for each folder across all workspaces in an instance.
  2. Set Duplicate Flag Field – Identify and set a Yes/No field on all case documents to mark them as original or duplicate.
  3. Date Field Parsing – Parse the date and time portions of a selected date field and write them to two separate fields for documents contained within a selected Saved Search (see the sketch after this list).
  4. Delete Empty Case Folders – Delete any empty folders in a workspace where the folder and all of its subfolders contain no documents.
  5. Environment Level User Login – Display a report of users' logins to Relativity and the date and time they accessed each workspace they have access to.
  6. Processing Statistics – Generate a report of processed data sizes per processing set and user in all workspaces in the environment. 
  7. Case Permission Audit Report – Generate a report that lists all users, workspaces, and groups, as well as each group's permissions. This report helps system admins manage their environment's permissions.
  8. Reviewer Statistics – Report on the efficiency of reviewers over a specified date range. The returned statistics provide a count of the documents reviewed over a certain period.
  9. Propagate Sent Date to Family Documents – Set all email family documents to the same sent date as their parent document in a case to enable chronological sorting across all documents. 
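
The scripts above run inside Relativity itself; as an illustration only, the sketch below mimics what Date Field Parsing (item 3) produces, using hypothetical field names:

```python
from datetime import datetime

def parse_date_field(value: datetime) -> dict[str, str]:
    """Split a combined date/time value into separate date and time strings."""
    return {
        "Parsed Date": value.strftime("%Y-%m-%d"),
        "Parsed Time": value.strftime("%H:%M:%S"),
    }

# One document's hypothetical Sort Date, split into two derived fields.
print(parse_date_field(datetime(2023, 4, 17, 14, 30, 5)))
# {'Parsed Date': '2023-04-17', 'Parsed Time': '14:30:05'}
```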

Processing Error Handling and Workflow 

To improve efficiency and turnaround time when handling the errors that inevitably occur during processing, implement a proactive approach as follows: 

 

  1. Create a defined processing error workflow. 
  2. Decide how many attempts to execute on specific data types (file types, encrypted/decrypted files, etc.) and what subsequent actions to take if a file still results in an error (a minimal sketch of such a retry policy follows this list).  
  3. Review the generated reports to identify any unresolvable files and help stakeholders determine a final approach to them. 
  4. Use the Error Actions console to retry or ignore errors. 
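
To make step 2 concrete, here is a minimal sketch of a retry policy expressed as data; the data types, attempt counts, and fallback actions are assumptions for illustration, not Relativity-recommended values:

```python
# Hypothetical retry policy keyed by data type; attempt counts and fallback
# actions are illustrative assumptions, not Relativity-recommended values.
RETRY_POLICY = {
    "encrypted": {"max_attempts": 3, "on_failure": "route to Password Bank review"},
    "container": {"max_attempts": 2, "on_failure": "re-extract and retry"},
    "unsupported": {"max_attempts": 1, "on_failure": "flag for stakeholder decision"},
}
DEFAULT = {"max_attempts": 1, "on_failure": "flag for stakeholder decision"}

def next_step(data_type: str, attempts_so_far: int) -> str:
    """Return 'retry' until the attempt budget is spent, then the fallback action."""
    policy = RETRY_POLICY.get(data_type, DEFAULT)
    return "retry" if attempts_so_far < policy["max_attempts"] else policy["on_failure"]

print(next_step("encrypted", 1))  # retry
print(next_step("encrypted", 3))  # route to Password Bank review
```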

Next Blog: Smoke Testing 

Be sure to check out our next blog covering the smoke testing our team performed to validate that Relativity works properly in the post-installation production environment. 

Relativity Certification & Implementation: Requirements & Key Metrics

To achieve consistent and reliable operational procedures, develop well-defined requirements and define workflows early in the project, while automated tasks such as data migration are still underway. Consider the following to help you and your organization define consistent standards for operational procedures. 

BAU Requirements 

To realize a successful Relativity implementation, ensure previously gathered requirements are fulfilled throughout the implementation phase.  To attain organizational objectives, confirm that requirements were properly identified, defined, and scored during the planning phase.  Key requirement categories are Business, Project, Technical, Security, and Information Governance.  

Password Bank 

Use Relativity’s Password Bank to decrypt certain password-protected files during inventory, discovery, and basic and native imaging.  Create a password bank to enable Relativity to run passwords against each encrypted document until it finds a match.  This reduces the number of errors in each job and eliminates the need to address password errors outside of Relativity.  Avoid creating excessively large password banks, which result in more matching attempts and extend processing job time.  Instead, consider creating multiple password banks tailored to specific matters or types of investigations (compliance, M&A, employment, etc.). 
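
As a conceptual sketch only, the loop below shows why bank size matters: each password is one more decryption attempt per document. The function and bank contents are hypothetical placeholders, not Relativity internals:

```python
# Conceptual sketch only: each bank entry is one more decryption attempt per
# document, which is why oversized banks extend processing time. The
# try_password stub and bank contents are hypothetical placeholders.
def try_password(document: str, password: str) -> bool:
    """Stand-in for a real decryption attempt; always fails in this sketch."""
    return False

MATTER_BANK = ["Winter2023!", "hr-archive", "deal-room-7"]  # made-up entries

def decrypt_with_bank(document: str, bank: list[str]) -> str | None:
    for password in bank:
        if try_password(document, password):
            return password  # match found: no password error for this file
    return None  # no match: the document surfaces as a password error

print(decrypt_with_bank("exhibit_12.pdf", MATTER_BANK))  # None in this sketch
```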

Reporting 

Review the feedback from Relativity reports to assess the progress and results of a job, make real-time adjustments to a project, or determine whether a job needs to be repeated.  Identify your organization's key performance metrics for processing and define operational procedures that automate reporting as much as possible. 

Optimize your processing and review workflows with common report types listed below: 

  • Pivot – Quickly analyze your workspace data to identify trends or patterns. Use Pivot to summarize data in tables or charts, simplifying the analysis process (an illustrative analogue appears after this list). 
  • Batch Summary Report – Use this report to display the status of all batches (In Progress/Completed/Pending) and the review status of each batch. 
  • Processing Reports – Choose from several processing reports, such as Discovery Reporting, Data Migration Reporting, Inventory Summary, Job Exception, Text Exception, etc. 
  • Search Terms Report – Use this report to identify documents containing hits on specific keywords or terms. 
  • Analytics Reports – Select from several analytics reports, including Population Statistics, Index Statistics, and Detailed Status. 
  • Assisted Review Reports – Choose from several Assisted Review reports, such as Round Summary, Rank Distribution, Project Summary, Control Set Documents, etc. 
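
Relativity's Pivot runs inside the workspace, so the snippet below is only an analogue: a pandas sketch of the same group-and-count summarization on toy metadata, with made-up field values:

```python
import pandas as pd

# Toy document metadata; custodians and file types are made up.
docs = pd.DataFrame({
    "Custodian": ["Smith", "Smith", "Jones", "Jones", "Jones"],
    "File Type": ["Email", "Word", "Email", "Email", "PDF"],
})

# Pivot-style summary: document counts broken out by custodian and file type.
summary = docs.pivot_table(index="Custodian", columns="File Type",
                           aggfunc="size", fill_value=0)
print(summary)
```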

Be sure to check out our next blog for insights and recommendations on implementing automated workflows, using scripts, and handling processing errors.  Learn how you can implement strategies to streamline administrative tasks, increase efficiency, enhance Relativity functionality, and achieve a consistent end-user experience.