Join us for lunch!

Our second Lunch & Learn is happening October 28th. Industry thought leaders and publishing executives will discuss innovation, trends, and publishing solutions that they deem relevant. The goal is that everyone will leave with tangible take-aways to implement in their publishing programs.

You asked and we delivered. Our second lunch topic is...

MathML: WORKFLOW TIPS FROM THE MATH-OBSESSED

Lunch: October 28, 2015 | 12:30 pm to 2:00 pm ET
Location: Vermilion Restaurant, 1120 King Street, Alexandria, VA

Authors who write in LaTeX can be obsessed with the detailed layout and spacing of their equations and unhappy when their TeX is converted and reformatted by the typesetting system's math engine. This has been a problem for 20+ years! In this installment of the Cenveo Publisher Services Lunch & Learn series, we'll hear from the math-obsessed and share workflow tips that have served them well in their publications.
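To give a sense of what is at stake, here is a small, hypothetical illustration (not drawn from any particular author's manuscript) of the hand-tuned spacing that conversion engines tend to discard:

    % Spacing an author might tune by hand to tighten a double integral
    \int\!\!\int_D f(x,y)\,dx\,dy

    % What a math engine may normalize it to (\iint requires the amsmath package),
    % dropping the author's negative-space tweaks
    \iint_D f(x,y)\,dx\,dy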

Evan Owens, VP Publishing Technologies at Cenveo Publisher Services, will lead the discussion with insight from you and other scholarly publishing executives. Debbie McClanahan, VP Publishing Services, will moderate the program.

This event is invitation-only and you are welcome to request more information by clicking the link below!


Posted by Marianne Calilhanna

Content wants to be found and consumed, but discoverability and accessibility don't happen by accident. With the exponential proliferation of content and its many delivery methods, it's more important than ever that publishers think deeply about content architecture and workflow processes that best support product strategy.

Join us for this webinar as we explore content architectures and workflows that ensure content is more discoverable and accessible, whether publishing journals, books, or educational material.

Our publishing experts will share insight on:

  • The difference between good and valid XML (see the sketch after this list)
  • Why content accessibility mandates should not be an afterthought
  • Designing content architecture that meets the demands of a multi-platform, multi-device world
  • Why filtered search is past its prime and how contextual and behavioral search will transform content discoverability
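To make the first bullet concrete, here is a minimal sketch (generic, JATS-like element names used purely for illustration): markup can be valid against a schema yet still bury meaning in presentation.

    <!-- Valid against a permissive DTD, but presentational: the meaning lives only in formatting -->
    <p><b>Keywords:</b> <i>accessibility, XML, workflow</i></p>

    <!-- "Good" XML: the same information captured as structure that software and assistive technology can act on -->
    <kwd-group>
      <kwd>accessibility</kwd>
      <kwd>XML</kwd>
      <kwd>workflow</kwd>
    </kwd-group>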

Panelists

  • George Kerscher, IDPF President and Secretary-General of the DAISY Consortium
  • Steven Heffner, Director of Product Strategy, Wolters Kluwer Health
  • Kevin Burns, Senior Vice President, Content Solutions, Cenveo

October 22, 2015 | 2:00 pm to 3:00 pm

This is a free webinar produced by Book Business & Publishing Executive.


Posted by Marianne Calilhanna

by guest writer, John Parsons

Since the early 19th century invention of braille, the concept of making written content available to the blind or visually impaired has been a noble aspiration of a civilized society. Making that concept a practical reality is another matter. Even as new, more automated technologies arise, the challenges of accessibility remain formidable.

[Figure: Overview of documents and content required for the NIMAS fileset.]

The rise of digital media has made the problem more acute since, like print, digital is an intensely visual medium. In his 2012 book Accessible EPUB3 (O’Reilly/Tools of Change), author Matt Garrish cites the phrase “digital famine,” meaning that only about 5% of books produced in a year are ever made available in an accessible format. “Although there are signs that this rate is beginning to tick upward with more ebooks being produced, the overall percentage of books that become available in accessible form remains abysmally small.”

For K-12 and higher education, the accessibility gap has dire consequences. However, accessibility can require significantly different approaches. According to a recent report from the American Foundation for the Blind (AFB), not all those 21 and younger who are legally blind use the same reading medium: only 9% use braille, while 29% are visual readers and 8% are auditory readers; alarmingly, 35% are non-readers. In other words, compensating for visual impairment can take many forms: tactile, auditory, and assisted or enhanced visual techniques for those with partial sight.

One source; many outcomes

Thankfully, these differences all point to a data-centric approach which can, in theory, resolve the accessibility issue for publishers. Words and images, particularly images with rich, descriptive metadata, are almost all inherently digital today. By authoring or converting this digital source data to a structured, machine-readable format, publishers can output to multiple formats as a matter of economic feasibility and even profitability—not just because accessibility is a compliance mandate.

According to the National Center for Accessible Educational Materials (AEM), there are four major specialized output formats for adapting printed instructional material to the diverse needs of the visually impaired. The first is braille, an alphabet of dot patterns that can be embossed on paper or rendered via a display device. Large print is self-explanatory—and theoretically most adaptable to ebooks and other digital display media. Audio—particularly the computerized text-to-speech variety—is third, followed by “digital text,” a general category encompassing any text and image descriptions that can be rendered by specialized or even general-purpose digital devices.

Since each of these four output choices follows predictable rules and logic, there is a definable way to use a structured "master file" approach—creating the content once and outputting as needed to as many formats as the market requires, with a minimum of manual intervention.
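As a rough sketch of the idea, assume a simplified master XML source (the chapter, para, and img element names are illustrative, not the actual NIMAS tag set) and one XSLT stylesheet per output. This one targets a large-print rendition; parallel stylesheets could target braille-ready text or audio scripts.

    <?xml version="1.0"?>
    <!-- Hypothetical stylesheet: renders the master XML as large-print HTML -->
    <xsl:stylesheet version="1.0"
        xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
      <xsl:template match="chapter">
        <html>
          <body style="font-size: 24pt;"> <!-- large-print rendition -->
            <xsl:apply-templates/>
          </body>
        </html>
      </xsl:template>
      <xsl:template match="para">
        <p><xsl:apply-templates/></p>
      </xsl:template>
      <xsl:template match="img">
        <!-- the same alt text would feed text-to-speech and braille outputs -->
        <img src="{@src}" alt="{@alt}"/>
      </xsl:template>
    </xsl:stylesheet>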

Enter NIMAS

AEM is the developer of the National Instructional Materials Accessibility Standard, or NIMAS (pronounced "nai-mass"), an XML-based specification for organizing and structuring textbook and other educational content. NIMAS is in turn a subset of an older XML standard known as the Digital Accessible Information SYstem, or DAISY, used to create Digital Talking Books or DTBs. Books stored in NIMAS XML can be easily rendered in any of the four basic output formats and made available to schools or programs for the visually impaired.

In the U.S., schools receiving federal funding support are required to provide materials in NIMAS format, and to facilitate the resulting output formats for their students. Increasingly, publishers must meet that requirement, and are looking for ways not only to comply with the federal mandate but also to increase the output flexibility of their overall operations.

We can help

Interested in learning how Cenveo Publisher Services can help your publishing organization manage content conversion to NIMAS and generate NIMAS filesets for delivery to the National Instructional Materials Access Center? Just click the button below and let us show you how we make it easy to support all your readers.


Posted by Marianne Calilhanna

We are launching a new educational and interactive series for the scholarly publishing community: Cenveo Publisher Services - Lunch & Learn. 

While enjoying lunch, industry thought leaders and publishing executives will discuss innovation, trends, and publishing solutions that are important to them. The goal is that everyone will leave with tangible take-aways to implement in their publishing programs.

Following are details about the first lunch:

Launching and Sustaining a New Journal: Publication Models to Consider 

July 22, 2015 | 12:30 pm to 2:00 pm ET | Alexandria, VA

Don’t let anyone tell you there’s no such thing as a free lunch! The one ‘cost’ of this invitation-only event is that participants are required to engage with fellow publishing executives and share their thoughts and lessons learned. We have a great line-up of topics and are accepting ideas from the community for future discussions.
— Marianne Calilhanna, Director of Marketing

Once publishers finalize the business decision to start a new journal, more considerations arise regarding effective production processes that ensure long-term and fiscally rewarding sustainability:

  • Should your organization adopt an Open Access model?
  • Does a continuous publication model make sense for your subject matter?
  • What tools are available that best support your authors?
  • What workflows are others considering?

Publishing consultant Mike Sherlock, from STM consulting firm Delta Think, will detail first-hand experience helping publishers navigate new journal launches. Other scholarly publishing executives will share specifics on solutions they have implemented as well as their roadmaps for moving forward. Debbie McClanahan, VP Publishing Services, will moderate the program.

This event is invitation-only and you are welcome to request more information by clicking the link below!


Posted by Marianne Calilhanna

Podcast from The Scholarly Kitchen

Stewart Wills, Editor and Content Director for Optics & Photonics News, published by The Optical Society, speaks with industry consultant Michael Clarke on The Scholarly Kitchen's podcast series. Listen to their great conversation about some of the growth engines—from new end-user products and services to new business models to mergers and acquisitions—that companies in scholarly communications are tapping as their traditional individual and institutional subscription businesses cope with flattening prospects. [Listen here!]

[Reposted with permission from The Scholarly Kitchen.]


Posted by Marianne Calilhanna

The Journal des Sçavans and the Philosophical Transactions of the Royal Society of London were first published 350 years ago (1665!) in France and England. Both publications were founded with the intent to advance scientific knowledge by building on colleagues' results and avoiding duplication of research. Thus commenced the principles of scientific priority and peer review. Prior to these publications, scholarly communication transpired via written correspondence, society meetings, and books. With this new publishing model, the scholarly journal provided a structured format and process for broad dissemination of knowledge, combined with systematic recording and archiving of scientific findings.

Recent media coverage and even scientific articles debate the future of scholarly publication models and the publishers behind them. Wired recently published an article, "The Web Will Either Kill Science Journals or Save Them," based on a paper published in PLOS ONE, "The Oligopoly of Academic Publishers in the Digital Age." Both make for interesting reading. This post is not intended to take a stance on the best approach or practices for publishing science, but rather to look more deeply at the core objectives of scholarly publishing: dissemination of knowledge and long-term archiving of that knowledge.

It's easy to say that "everyone can be a publisher in the digital age." And indeed, open systems can be viewed as democratizing content, but there are still costs and work involved if the core objectives of dissemination and archiving are to be achieved. People and systems need to ensure proper content architecture and applicable metadata that support distribution and archiving. Digital does not equal free. As print runs decline for scholarly journals, online distribution increases, and the costs associated with online are often overlooked from a macro industry view. Somewhere a server needs to host that content, and content is only useful if it can be searched and found. The devil in those details is complex; real business decisions and questions of economics come into play. Can we rely on crowdsourcing to truly vet science? Can a global supply of authors apply metadata and key terms that speak to broad international readers and researchers? Are systems "smart enough" to support content structure that ensures dissemination across content platforms?

Time will tell. 

So while we sit back and toast the scholarly publishing community (its authors, publishers, researchers, reviewers, editors, and readers) on 350 years of profound change, I'm interested to hear your thoughts on whether the Internet will "kill science journals or save them."

 


Posted by Marianne Calilhanna

Reuters, the world's largest international multimedia news provider, announced it will offer a portion of Reuters multimedia news content for free to digital publishers through its publisher platform, Reuters Media Express.

An intuitive search interface allows registered users to drill down by keyword, media type, and date, so news content is easily found, and tools to share and embed content are one click away.

This is the first time in more than 160 years that Reuters has offered content free of charge on this scale. From the official press release, Steven Schwartz, Global Managing Director, said:

We are committed to innovating so that even more digital publishers and bloggers can take advantage of Reuters high-quality, relevant multimedia content to engage their audiences. As the industry transforms we are willing to disrupt traditional approaches to gain insights and help news publishing flourish, while also growing our reach and our business.
— http://inpublic.globenewswire.com/releaseDetails.faces?rId=1927434

My head is swirling with ideas of how this will help publishers of all types promote their publications (both print AND digital) and engage social involvement across publishing channels. Demonstrating the relevance of niche studies alongside media coverage of parallel current events can help amplify a publisher's voice. The variety of media assets (free of charge!) and the terms of use will complement websites and blogs. For example, if I were a journal publisher focused on the field of astronomy and astrophysics, I might find the recent news of the Magellan Telescope construction an interesting piece to share on my blog.

Interested to hear others' thoughts on how Reuters Media Express might be used. Leave a comment below.


Posted by Marianne Calilhanna

Connecting Diverse Perspectives

Following is a collection of observations, tweets, and comments captured during last week's 37th Annual Meeting of the Society for Scholarly Publishing. Share something valuable you learned in the comments section!


Posted by Marianne Calilhanna

A Webinar for Publishers | May 21, 2015 11:00 am to 12:00 pm ET

Panelist: Evan Owens, VP Publishing Technologies, Cenveo Publisher Services

In any standards landscape there is diversity, and there is not likely to be a single or permanent solution. Because content often has to connect to bibliographic and business metadata, integration with other standards can be both a requirement and a challenge. Additionally, markup language standards for content have multiple goals: text capture/rendition and metadata management. Balancing these goals has historically been a design issue. This webinar, featuring Evan Owens, VP of Publishing Technologies at Cenveo Publisher Services, explores how standards address these challenges and how book and journal publishers can benefit in terms of packaging, metadata, and cleanly articulated content components.

We’ll look at examples of CHORUS metadata issues to reinforce strategies your publishing organization should consider and answer any questions you have.


Posted by Marianne Calilhanna

The following content is excerpted from Publishers Weekly, Digital Solutions in India 2015: The 10th Annual Review. 

Convoluted RFPs (requests for proposals), complex NDAs (non-disclosure agreements), abstract briefings, and impossible deadlines are nothing new to India-based vendors. In fact, they thrive on the difficult and challenging, as shown by the following projects (or abbreviated case studies) that fully encapsulate their capabilities and unique solutions.

Cenveo Publisher Services

"We helped launch a brand-new, Open Access, online-only journal from scratch in less than eight months," says marketing director Marianne Calilhanna, adding that Cenveo Publisher Suite with XSLT was used to produce standardized PDF and XML files for an early-release deliverable 24-hour post-acceptance. The workflow also delivered the final full-text XML, and copyedited and composed articles within 10 days of acceptance. "The latest JATS DTD with ORCID, FundRef and reviewer comments were included, together with video and component DOIs, which were generated on the fly. The journal runs on Drupal/Jcore platform provided on HighWire Press."

The team also completed the full-service production of McGraw-Hill's premier medical reference text, Harrison's Principles of Internal Medicine, 19th Edition. "This edition involved copyediting of more than 17,000 manuscript pages and complex composition of over 4,000 pages. Our team also took on proofreading and indexing of this massive tome, which covers 486 print-book chapters and 137 online chapters. There were also more than 2,200 figures and 1,200 tables in this book," notes Waseem Andrabi, senior director for global content services, adding that the project managers collaborated with more than 600 authors and six lead editors.

[Read more from Publishers Weekly, Digital Solutions in India 2015, special report.]


Posted by Marianne Calilhanna

The following content is excerpted from Publishers Weekly, Digital Solutions in India 2015: The 10th Annual Review. 

The first Publishers Weekly (PW) coverage of the India-based digital solutions industry, which was then widely known as “content services,” coincided with the launch of Twitter and Google’s acquisition of a 22-month-old startup called Android Inc. The iPhone and iPad were, respectively, one and four years away from being launched. Social media didn’t exist, phones were mobile but not yet smart, and life was just fine.

The year was 2006, and the report, titled "Content Services and Printing in India," focused on print- v. content-centric workflows, with conversations revolving around XML, PDF, and e-deliverables. Also included was a “Know the Lingo” sidebar—on SGML, XML, front-end XML, DTD, batch publishing, 3B2, TeX, and LaTeX—to explain the acronyms and new workflows, for the benefit of those who were about to embark on the content digitization path.

The fact that XML—which was introduced 20 years ago, in 1996—was a focus 10 years ago, and has since become nearly ubiquitous, drives home the point that new technology often comes fast and furious, while adoption tends to be slow and sporadic. Costs of shifting to a new technology or workflow aside, change is truly scary for those operating in the legacy print-centric publishing world. In the case of XML, it really is a necessity for ensuring content neutrality, reusability, and multiplicity, while preventing content obsolescence. In short, XML is required for content longevity and healthy bottom lines (or even survival) for publishers. And that has been PW’s main message right from the start of its coverage.

Since 2006, the conversations have been expanded to cover content mobility (with e-books, e-learning, and mobile apps), cloud technologies, accessibility, Big Data, and discoverability—thus reflecting the tremendous shifts and transformation in digital solutions services, publishing models, and consumer demands. The industry is now focused on intuitive and dynamic workflows, interactive and integrated media, scalable and customized solutions, aggregated and dechunked data, single-source and multipronged processes, and agile and mobile technologies.

PW asked nine vendors to share their thoughts about the industry back in 2006, the changes they’ve seen since then, and what lies ahead.

10 Years Ago...

I thought print would be obsolete and XML-first workflows would be a done deal for all publishers.... TODAY, publishers who hesitated with XML workflows might find that HTML5 provides the type of interoperability required for content distribution. Or it might not.... There is no simple, single answer for all publishers. But what we continue to observe is that print drives digital, and digital drives print. And, no matter where the markup language is implemented in a publisher’s workflow, it is definitely implemented. —Marianne Calilhanna/Cenveo Publisher Services

The following graphic captures just some of the transformations we've observed that impact our publishers' business. What technology and content transformations would you add to the list?


Posted by Marianne Calilhanna

The following content is excerpted from Publishers Weekly, Digital Solutions in India 2015: The 10th Annual Review. 

By Evan Owens, Vice President Publishing Technologies, Cenveo Publisher Services

In any standards landscape there is diversity, and there is not likely to be a single or permanent solution. Because content often has to connect to bibliographic and business metadata, integration with other standards can be both a requirement and a challenge.

Creating industry standards that provide real-world value is a collaborative effort involving experts and leaders from a broad cross-section of specialties.
— James Bryce, British academic, jurist, historian, politician

Over time, standards will evolve and new standards will appear, adding to the diversity of the landscape. During the past two decades, there has been a move away from proprietary, publisher-specific SGML and XML DTDs and toward the adoption of industry standards. But as those industry standards evolve, adoption of new versions can take time; for example, in journals publishing, the NLM DTD was used for a long time after its successor, the JATS DTD, was released because journal hosting platform vendors had not made the switch to the new standard.

Content markup standards tend to be very rich vocabularies that support a wide range of functions beyond just capturing the text. Thirty years ago, rendition of the text was the core function of content markup, which was the digital equivalent of typesetting and support of the print world. The next generation of content markup moved the focus to structure (sections, components, etc.) and rendition based on that structure. That change facilitated the management of metadata within the text.

With the emergence of the Internet, content markup standards expanded to include internal and external relationships—features well beyond the world of print pages. Recently, content markup has become even more complex with support of semantic enrichment: markup of entities and concepts and pointing to external intellectual resources such as linked data.

Throughout the history of content markup standards, there have been key design challenges that have had a variety of solutions. These challenges have come and gone, and reappeared over the decades of markup solutions.

One of the earliest design issues was generated text and automated styling: does the file contain every bit of text visible to the reader or is some of it generated by styling and rendition rules? And if so, how is that generated text managed over time and archived?
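A concrete instance of that choice, sketched with JATS-style elements (illustrative only):

    <!-- Option A: the label text is captured in the file itself -->
    <fig id="f1">
      <label>Figure 1.</label>
      <caption><p>Workflow overview.</p></caption>
    </fig>

    <!-- Option B: no label element; the string "Figure 1." is generated by the
         rendition stylesheet, so it must be reproduced consistently every time
         the archived XML is re-rendered -->
    <fig id="f1">
      <caption><p>Workflow overview.</p></caption>
    </fig>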

Another challenge has been managing metadata: is it present in the reader-visible text, or is it additional information to be embedded invisibly in the file as elements or attributes? A classic issue in structural markup since the early days of SGML has been whether the markup is presentational or semantic.

A newer trend in content markup has been in response to internationalization: support for textual alternatives such as translations and transliterations, which is robustly implemented in both JATS and ePub3. As content markup moved beyond just creating printed pages to managing collections of text and media, packaging has become a key issue. The ePub standards include the most robust and elegant solution to date.

In the history of content markup languages, the most significant change has been the move from rendition to structure. TeX was a rich typesetting language; LaTeX was a set of macros built on TeX that associated formatting with structure. SGML and XML were focused on structure, but rendition elements were present in varying degrees. HTML was both presentation and structure along with dynamic reflowable rendition. ePub supported reflowable rendition, but ePub3 now also supports fixed layout, so the pendulum can swing both ways as standards evolve.

HTML5 is a big change from early HTML: it is now more "structurally semantic," and presentation has moved to the CSS vocabulary. So it is much more like a document DTD in the XML tradition, with rich structural markup. An amusing example of this profound change is the group of elements that in HTML4 were called "font style elements" and in HTML5 are called "text-level semantics." The "i" element definition changed from "renders as italic text style" to "represents a span of text in an alternate voice or mood." This is a recent instance of a key markup design choice that has been on the table since the earliest days of SGML.
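A minimal before-and-after on a hypothetical fragment shows the shift:

    <!-- HTML4 habit: italics applied purely for visual effect -->
    <p>The term <i>in vitro</i> appears throughout.</p>

    <!-- HTML5 reading: <i> marks a span in an "alternate voice" (here, a Latin term);
         purely decorative styling would instead move to a CSS rule -->
    <p>The term <i lang="la">in vitro</i> appears throughout.</p>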

Content markup standards are inherently complex, so there can be a wide variety in how they are used. Standardization of usage can occur at various levels: within a single work, a collection, a publisher, or a genre. Usage can also change over time. These practices are defined as "usage profiles" and include how one standard invokes another. For example, ePub uses a specific profile of HTML5. Meanwhile, JATS4R (JATS for Reuse) is a newly developing profile defining a subset of the JATS standard.

When designing content management solutions using content markup standards, there are many key issues. The highest-level choice is whether to use a delivery format or a master format from which multiple renditions can be generated; the new rich design of HTML5 means that it could potentially be both. Reusability of content, present and future, is another key solution requirement. As standards will inevitably evolve over time, version control of both content and specification/standardization is critical to protecting the future use of the content.

Metadata management is a long-standing design challenge: is metadata embedded in the content, extracted from it, or managed externally? A robust workflow solution has to manage not only the structured markup but also the content itself; e.g., editorial preferences such as section headings. Controlling content styling is not a functionality of markup languages, but it can be managed with pre-edit and validation tools that connect the markup structure with editorial guidelines.
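One common way to connect markup structure with editorial guidelines (a sketch only, not necessarily the tooling referred to above) is a Schematron rule set run as a pre-edit or validation step:

    <schema xmlns="http://purl.oclc.org/dsdl/schematron">
      <!-- Hypothetical house-style rules for JATS-style sections -->
      <pattern>
        <rule context="sec">
          <assert test="title">Every section must begin with a title element.</assert>
          <assert test="not(substring(title, string-length(title)) = '.')">
            Section titles should not end with a period.
          </assert>
        </rule>
      </pattern>
    </schema>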

The NISO JATS standard is the most widely used standard in the journal publishing space. It is a massive content markup vocabulary with rich markup for front matter and citations. Because it is complex, usage is wildly variable across the community. ePub3, described as a "distribution and interchange format," is a great set of components that goes beyond just content markup: package, metadata, rendition, and semantic support such as RDFa. Because of the shift of HTML5 toward structural markup, the difference between JATS and ePub has diminished.
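For context, the packaging piece of ePub3 is a small XML "package document" that ties metadata, manifest, and reading order together; a minimal sketch (identifiers and file names are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <package xmlns="http://www.idpf.org/2007/opf" version="3.0" unique-identifier="pub-id">
      <metadata xmlns:dc="http://purl.org/dc/elements/1.1/">
        <dc:identifier id="pub-id">urn:uuid:00000000-0000-0000-0000-000000000000</dc:identifier>
        <dc:title>Sample Issue</dc:title>
        <dc:language>en</dc:language>
        <meta property="dcterms:modified">2015-01-01T00:00:00Z</meta>
      </metadata>
      <manifest>
        <item id="nav" href="nav.xhtml" media-type="application/xhtml+xml" properties="nav"/>
        <item id="art1" href="article-1.xhtml" media-type="application/xhtml+xml"/>
      </manifest>
      <spine>
        <itemref idref="art1"/>
      </spine>
    </package>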

In the past, the differences between the books and journals communities were significant. Books had strong metadata needs driven early on by the book-selling process; journals later moved to significant metadata exchange. Because e-book readers were industry tools, book publishers had to support industry-standard applications, while online journal publishers focused on HTML and PDF on hosting platforms. In the last decade, the differences have diminished as the rise of mobile reading devices as reader platforms has affected both books and journals. Some journal publishers have started using ePub to deliver articles or entire issues.

As the standards landscape continues to evolve, long-term success will depend on leveraging community expertise and implementing standards using robust content management practices.


Posted by Marianne Calilhanna

The following content is excerpted from Publishers Weekly, Digital Solutions in India 2015: The 10th Annual Review. 

Ten years ago, it was all about content conversion and e-deliverables. Today, the digital content proposition is no longer as simple. 

Now content needs to be curated, customized, converged, and cloud-based. Publishers and content creators/aggregators big and small are racing to achieve faster times to market; to develop a better understanding of their consumers; to serve ever more customized, agile, mobile, and intuitive content; and to monetize engaged readers.

Balancing cost and quality has always been a major challenge for any publishing or content company. Publishers Weekly (PW) interviewed select vendors—a small sampling out of hundreds of digital solutions companies thriving in India—to understand how automation, innovation, and persistence are the key to their competitiveness (and survival). 

This review—which remains as unscientific as it was 10 years ago—is totally discriminating in that it features only vendors that appear on PW's radar and caught its eye with novel solutions, unique workflows, and interesting propositions for dealing with constantly changing content, consumer demands, and technologies.

Opportunities & Challenges

Waseem Andrabi, senior director for global content services, definitely sees adaptive learning continuing to gain traction throughout 2015, with learning opportunities pushing into new frontiers including virtual reality. More publishers, he says, will consider continuous publication models rather than packaging products and publishing periodically. “The rise of smartphones and tablets as the primary user interface—instead of PCs or laptops—means that content has to be structured for seamless data interchange across media and devices. Such ‘transformative publishing’ requires publishers to constantly evolve and transform workflows to remain relevant. The content itself needs to transform as well.”

Marianne Calilhanna, Cenveo’s marketing director, believes tools for researchers and authors are becoming increasingly important. “Today’s researchers and authors are computer-savvy, and they expect to have access to tools that they can use. So when we offered publishers Smart Proof—an online proofing and correction tool that is a part of the Cenveo Publisher Suite—the reception was overwhelmingly positive. At the end of the day, authors simply want to communicate their ideas, and they want to be in control of that communication.”

The Cenveo Publisher Suite, adds Calilhanna, is developed with the core objective of delivering quality content as fast as possible. “What is important to our publishers is editorial integrity, and fast delivery of structured, high-quality content. We push publishing back into the hands of those that matter most—the editors and authors—and with that, the black hole of vendor processing becomes transparent, consistent, predictable, and nonthreatening. Through tools such as Smart Edit and Smart Proof, we put the power in the authors’ hands, and we provide publishers with the confidence that our tools will manage the content structure while the authors manage the content.” The Cenveo Publisher Suite has been proven to reduce turnaround time by up to four days, reduce errors during proofing, and allow for creation of new products and deliverables on the fly within one to three days.

Meanwhile, Cenveo’s Digital Content group is busy designing learning products to help the 21st-century learner succeed. “Online courseware, quizzes, learning apps, games, simulations, you name it. Their quest is to bring learning opportunities to where the learners are. Hence the group’s mantra, ‘design once, deploy everywhere.’ All our e-learning modules are packaged in contemporary interfaces that are smooth and easy to navigate,” explains Waseem Andrabi, senior director for global content services.

As for growth segments, Calilhanna sees “the opportunities in digital fulfillment and content-management archiving, and cloud-based software as a service. Complex project management and custom-built architecture for publishers that cannot afford to hire such resources in-house is also big. These are the areas that Cenveo is working on while continuing to improve our turnaround time to publish quality content with a faster time to market.”

[click here to read the full article]


Posted by Marianne Calilhanna