By Jennifer Bantelman
Understanding all of the systems in use across your company has been a growing challenge for years, and many companies are only now getting a handle on maintaining a data map of all the applications in use.
Now, in light of the “new normal” of distributed teams, it’s time to reassess. Did you have to add virtual access systems for sensitive on-premises data? What collaboration and conferencing tools have had to be added?
In July’s PREX Summit Series webcast on Assessing Your Data Maps in a Newly Distributed World, our expert panel set out to answer these questions and more.
This session’s panelists were Deanna Blomquist, Sr. Manager of IT-eDiscovery at Dish Network; Charisse Fletcher, Assistant VP, Discovery Operations at JPMorgan Chase; and Ryan Zilm, Director of Information Lifecycle Management at USAA. The session was moderated by Dan Nichols, Partner at Redgrave LLP.
There are several components of a good data map; the panel highlighted the key components to consider.
When considering locations of information and the risk profile, keep in mind that moving from on-premises systems to cloud-based systems managed by third parties usually reduces your maintenance burden but also reduces your control over export formats. Understand what each system does by default, and what can be configured.
With COVID-19 moving many businesses into semi-permanent work-from-home arrangements, there has been a significant increase in Zoom, Google, Teams, and other online real-time video, collaboration, and live chat sessions.
As part of your data mapping, it is critical to consider what systems have been rolled out recently due to this additional usage. Make sure these are reflected in your data map, and that you understand the functionality, especially around recordings of video, audio, and screen sharing, locations of the data, and any retention schedules.
Remember that not all data is equal. Consider including an information risk matrix to identify the most critical systems, and start with those to drive risks down. Technology can assist here: scanning tools can query systems and report back so that you understand the data you are capturing.
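A risk matrix like the one described above can be as simple as a scored table. The sketch below is a minimal, hypothetical illustration: the system names, scoring criteria (sensitivity, litigation exposure, volume), and weights are all assumptions for demonstration, not anything prescribed by the panel.

```python
# Hypothetical information risk matrix: score each system on a few
# criteria, then rank so the riskiest systems are addressed first.

SYSTEMS = [
    # (name, data sensitivity 1-5, litigation exposure 1-5, data volume 1-5)
    ("HR records",         5, 4, 2),
    ("Email",              4, 5, 5),
    ("Marketing wiki",     2, 1, 3),
    ("Video conferencing", 3, 3, 4),
]

def risk_score(sensitivity, exposure, volume):
    """Combine criteria into one score; higher means address sooner.
    The weights are illustrative assumptions."""
    return sensitivity * 2 + exposure * 2 + volume

ranked = sorted(SYSTEMS, key=lambda s: risk_score(*s[1:]), reverse=True)
for name, *criteria in ranked:
    print(f"{risk_score(*criteria):>2}  {name}")
```

In this example, email ranks first because it combines high litigation exposure with high volume; your own criteria and weights would come from your organization's risk profile.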
It is also important to consider the types of data, as well as any limitations. Yes, most of your data may be ESI, but don’t forget about legacy systems and any special considerations. This may include physical documents, and bear in mind that you may also have documents or systems that degrade over time.
Obviously, there are many considerations in putting together your data map, and keeping it updated is a constant challenge. To do it in a way that makes sense for your organization, consider the reason for the data map and how you might leverage it.
Map the who: consider not just who the data map is for, but also who owns each system from a business and a technical perspective. This should be a cross-functional effort and will need periodic reassessment. Often people change more frequently than the systems with which they work.
Consider surveying system owners and potential system owners to get a better overview of your data and to learn about new systems that might not be enterprise-wide. Challenge the information you receive: filling out these surveys probably isn’t a data steward’s day job, so ask probing questions to validate the responses. Leverage your stakeholders, and don’t forget to support them as well.
Map the what: structured and unstructured data may be managed and accessed very differently, as may cloud and on-premises systems. Understanding the “what” will guide your reassessment schedule and keep the data map fresh. Consider as well what protections you have in contracts with third parties who host your data.
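Pulling the "who" and "what" together, a single data map entry can be sketched as a small record. This is a minimal illustration under assumed field names (the system "TeamChat", its owners, and retention values are all hypothetical), not a standard schema.

```python
from datetime import date, timedelta

# Hypothetical data map entry; every value here is an illustrative example.
data_map_entry = {
    "system": "TeamChat",                    # hypothetical collaboration tool
    "business_owner": "Head of IT",          # who owns it on the business side
    "technical_owner": "Collab Engineering", # who runs it technically
    "hosting": "cloud",                      # "cloud" or "on-premises"
    "data_types": ["chat", "file attachments", "call recordings"],
    "structured": False,                     # unstructured data is accessed differently
    "retention_schedule": "90 days by default",
    "export_formats": ["JSON"],              # verify what the vendor actually supports
    "contract_protections": "data export clause in MSA",
    "last_reviewed": "2020-07-01",           # drives the reassessment schedule
}

def needs_reassessment(entry, today, max_age_days=180):
    """Flag entries whose last review is older than the allowed age."""
    last = date.fromisoformat(entry["last_reviewed"])
    return today - last > timedelta(days=max_age_days)
```

A helper like `needs_reassessment` shows how the "what" can drive the review cadence: entries past a chosen age (180 days here, an arbitrary choice) get flagged for the next cross-functional review.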
Overall, it’s important to begin with the purpose in mind, so that your data map is built for the need at hand, and you avoid both over-engineering and missing critical components. Start with the systems that have the highest risk or regulatory burdens on them, and remember to reassess on an ongoing basis.
Jennifer is a technologist focused on strategy, customer experience, workflow, process improvement, and product in the legal tech industry. She has worked in software and technology for over a decade, and holds an M.A. in Strategic Communication. She currently leads Solutions Engineering at Zapproved, where she ensures product feature functionality and technical capabilities are designed and implemented in ways that solve real world problems. Jennifer is a speaker and content contributor on a variety of technology, data preservation and ediscovery issues, and is the Chair of the PREX Conference.