
Beyond Page Breaks: Proofing Digital Content


At Inkling, we believe there’s no reason for digital content to be any less meticulously crafted than the print version, and yet it can be enormously difficult to maintain that same high level of quality across all digital products and outputs. In this series, written by a team at Inkling dedicated to solving that problem, we’re exploring the concept of content quality. This post digs into one of the most important steps for ensuring content quality: proofing. We share two methods we’ve developed for proofing digital content.

Proofing is often a thankless task. A great proofing job is inherently invisible to the reader; only when a mistake slips through into the finished product does it become apparent how necessary rigorous, thorough proofing is.

Ask a copyeditor how best to proof content and they’ll share an efficient system that tackles technical and complex material. Yet even the most foolproof system designed for fixed-layout content is bound to break when it comes to proofing digital content. For example, a chart that looks perfect in its position on page 34 might break when seen on a Kindle or iPhone. When you’re testing content across a variety of devices and formats, how should your approach to proofing change and adapt?

At Inkling, we’ve been tackling this exact question for years, and through a lot of trial and error and close partnerships with vendors and publishers, we’ve created a new system of proofing digital content that ensures content quality, regardless of device. While this post gets fairly technical and into the weeds of digital content proofing, we think it also illustrates why having a foundation in structured content authoring for any content project pays dividends down the road.

Deep dives and layer cakes: two digital proofing methods

There are two methods of efficiently proofing a large piece of digital content. One is to review a representative subset of the content as meticulously as possible, identifying any errors or opportunities for improvement. Should the content be clean, the assumption is that the rest of the project is equally so, and the assumption works in reverse as well–errors are assumed to be repetitive and should consequently be searched for across the whole of the project. We call this kind of deep dive “vertical proofing”.

The other method is to target specific patterns across the project in a more rigorous survey, as if looking at individual layers in a layer cake. For example, if there’s a problem with one sidebar, then you should look across the whole project to compare each sidebar against the next to ensure consistency. At Inkling, we believe in employing both methods, starting with a vertical proof and then finishing with horizontal proofs of any areas that proved problematic.

Vertical proofing

Due to the structured, rules-based nature of digital content, which is styled by classes defined at a single source and built from a list of patterns that have been fully vetted in the production sample, completing a deep proof of an entire title would be largely redundant. Instead, we work from the assumption that a well-built and functional pattern in one location should be consistent across the entire project. The goal of a vertical proof is not to touch every bit of the content but, rather, to do just the opposite–use a sample to know what is working and what is not.

The content selected for the vertical proof must contain as many pattern variations as possible. Complex layouts or functionality are also useful to include in a sample. The reviewer looks at this small subset as if it were the whole–a synecdoche of sorts. We call the proof “vertical” because we progress naturally, linearly through the content, and then take stock of our results in the end.

During the vertical proof, it is important to stress test the content as we once did with the production sample, to ensure that the patterns hold up and behave as expected. The reviewer should also revisit the content in different displays and orientations. While there are many ways to do this, we use a feature in Habitat that lets us, at the click of a button, see how content might render on a mobile device or a large web browser display, without investing in expensive hardware or creating multiple versions of a project and downloading them separately onto a variety of machines. Investing in tools that simulate the dimensions and quirks of multiple devices might have an upfront time and engineering cost, but these tools can save content creators a lot of time, money, and headaches in the long run.

After completing a thorough audit of the content during the vertical proof, the reviewer will have an informed understanding of how to strategically go about auditing the rest of the title. Any errors may indicate a part of production that was less robust and therefore susceptible to repetitive errors throughout the title. In this case, a horizontal proof of that specific type of content is appropriate.

Horizontal proofing

Called a horizontal proof because it slices the content to analyze just a cross-section, the horizontal proof allows us to focus on some repetitive element that might need special attention. For example, if the vertical proof found consistent capitalization errors in the titles of sidebars, a horizontal proof might follow to check all sidebar titles for this issue. This could be done manually by scanning the content from one end to the other, checking only for the capitalization error. However, if the content is well-structured, there are other tools and methods that can help the reviewer quickly find–or even fix–the error at a global level.
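To make the sidebar example concrete, here is a minimal sketch of what a horizontal capitalization check might look like when content is structured HTML. The `sidebar-title` class name and the title-case heuristic are assumptions for illustration, not Inkling’s actual markup or tooling.

```python
from html.parser import HTMLParser

class SidebarTitleCollector(HTMLParser):
    """Collect the text of every element carrying a hypothetical
    'sidebar-title' class, so all titles can be reviewed in one pass."""
    def __init__(self):
        super().__init__()
        self.titles = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        classes = dict(attrs).get("class", "") or ""
        if "sidebar-title" in classes.split():
            self._in_title = True

    def handle_endtag(self, tag):
        self._in_title = False

    def handle_data(self, data):
        if self._in_title and data.strip():
            self.titles.append(data.strip())

def flag_capitalization(titles):
    # A simple heuristic: flag any title that is not in title case.
    return [t for t in titles if t != t.title()]

doc = """
<aside><h2 class="sidebar-title">Key Terms</h2></aside>
<aside><h2 class="sidebar-title">further reading</h2></aside>
"""
collector = SidebarTitleCollector()
collector.feed(doc)
print(flag_capitalization(collector.titles))  # ['further reading']
```

Because the check runs over the source markup rather than rendered pages, it surveys the whole project in seconds, which is exactly the economy a horizontal proof is after.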

At Inkling, we embrace the fact that our content is made out of code. We will discuss the idea of content-as-code next week, but for now we’ll look at several tools we developed in-house, built upon some hard-earned expertise we gained while struggling with each issue on a case-by-case basis.

For example, footnotes and annotations have often bedeviled copyeditors. To help us deal with this issue, we developed a way to see all of the content located in footnotes or other annotations at a glance. Using this tool, the reviewer can get a quick overview, without having to click into each footnote one by one. It allows us to check for consistent formatting of text, links, images (if available) and ensure that annotations match the content to which they are appended. You can skim the content and look only for those issues specific to footnotes, without being distracted by other patterns or functionality.
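The footnote view described above is part of Habitat, but the underlying idea can be sketched generically: pull every annotation out of the markup and lay the results side by side. The `footnote` class name and the regex approach here are assumptions for illustration (a real implementation would use a proper HTML parser rather than a regex, which is fragile on arbitrary markup).

```python
import re

def extract_footnotes(html):
    """Pull the inner text of every element whose class list includes
    a hypothetical 'footnote' class, for side-by-side review."""
    pattern = re.compile(
        r'<(\w+)[^>]*class="[^"]*\bfootnote\b[^"]*"[^>]*>(.*?)</\1>',
        re.DOTALL,
    )
    # Strip any nested tags so only the reviewable text remains.
    return [re.sub(r"<[^>]+>", "", m.group(2)).strip()
            for m in pattern.finditer(html)]

doc = """
<p>Prices vary.<span class="footnote">1. As of 2014.</span></p>
<p>See appendix.<span class="footnote">2. Omitted in digital editions.</span></p>
"""
for note in extract_footnotes(doc):
    print(note)
```

Collecting the notes into one flat list is what lets a reviewer scan for inconsistent numbering or formatting without clicking into each footnote individually.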

We’ve also refined the search function in Habitat to allow us to search for very specific patterns. By searching for just one kind of pattern, we can check whether each use of that pattern appears the same every time. As with the annotations, this view collects all instances of the same pattern in one place for distraction-free reviewing, and we can ensure that no significant differences exist among them. Crucially for non-static digital content, the search results are all fully functional and interactive–links work, images are present, and each pattern can link back to its original location, so any fixes can be made easily.
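One way to check that every instance of a pattern “appears the same every time” is to fingerprint the structure of each instance and flag the outliers. This sketch assumes a hypothetical `definition` pattern class; Habitat’s actual search works differently, but the comparison idea is the same.

```python
from html.parser import HTMLParser

class PatternFingerprinter(HTMLParser):
    """For every element carrying the target class, record the sequence of
    tags nested inside it. Instances whose tag sequence differs from the
    majority are likely broken and worth a closer look."""
    def __init__(self, target_class):
        super().__init__()
        self.target = target_class
        self.depth = 0          # > 0 while inside a target element
        self.fingerprints = []  # one tuple of nested tags per instance
        self._current = []

    def handle_starttag(self, tag, attrs):
        if self.depth:
            self.depth += 1
            self._current.append(tag)
        elif self.target in (dict(attrs).get("class", "") or "").split():
            self.depth = 1
            self._current = []

    def handle_endtag(self, tag):
        if self.depth:
            self.depth -= 1
            if self.depth == 0:
                self.fingerprints.append(tuple(self._current))

doc = """
<div class="definition"><dt>API</dt><dd>An interface.</dd></div>
<div class="definition"><dt>CSS</dt><dd>A style language.</dd></div>
<div class="definition"><dd>Missing its term!</dd></div>
"""
fp = PatternFingerprinter("definition")
fp.feed(doc)
majority = max(set(fp.fingerprints), key=fp.fingerprints.count)
outliers = [i for i, f in enumerate(fp.fingerprints) if f != majority]
print(outliers)  # [2] – the instance missing its <dt>
```

Reviewing only the flagged indices, rather than every instance, is what keeps a horizontal proof efficient even on a large title.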

Expanding beyond just a group of patterns, Inkling Habitat likewise includes the ability to see all of the content at a glance. This can help to identify any problem areas while looking quickly at the content, and then we can select any pattern and pull it up within search, as described above. This means that if something catches our eye, we can easily focus in on it–and all similar patterns–to address any potential problems.

Conclusion

We should end this with a caveat: all of our proofing assumes that the actual text–the letters and their punctuation, the grammar, the context, the wording that authors and editors pore over to make perfect–has already been perfected. The proofing we’re discussing is about functionality and layout for digital content, for finding typos introduced by programmed ingestion or scripted fixes. There is no substitute for a pair of human eyes doing a proof to ensure clean, well-written content.

Still, creating smart tools and continually improving on digital production methods will result in huge, sustainable wins for better content quality beyond the single title at hand. This is all made possible by the content’s structured code foundation. Rethinking how you proof digital content–diving deep on samples or looking at a single pattern across the whole–can yield highly functional, well-crafted content, while still remaining efficient during the digital content creation workflow.

What are some other ways that you and your teams have tackled proofing digital content? We’d love to hear how you do things and answer any questions in the Comments below!