PleaseTech blog

We aim to provide useful, pertinent and sometimes fun insights into the world of document collaboration and the workings of a technology company

Introducing PleaseReview v5.1

Posted by David Cornwell on 2. April 2014 09:34

Founder/CEO of PleaseTech Ltd - collaborative document review and co-authoring for the enterprise.


April already. They say that ‘time flies when you're having fun’. I can tell you that it certainly flies when you are trying to get a software release out.

The testing of PleaseReview v5.1 is now well under way and, assuming no major issues are identified, we expect a release date towards the end of May. We never release until we are sure it’s a quality product and our testing is complete. There is testing of the new functionality, regression testing, and installation and upgrade testing, plus, of course, all the documentation and other support material to prepare. There are so many elements in the mix.

So what will PleaseReview v5.1 contain? Well, as always, the thinking behind the release is to:

- Continue the ‘beyond review’ strategy;
- Facilitate enterprise rollout with enterprise enhancements;
- Address new client requirements;
- Keep up with the changing environment.

The ‘beyond review’ strategy’s intention is to consolidate our thought and technology leadership by adding value to the review process.

So, what does PleaseReview v5.1 include?

Sub-review and parallel reviews

One of the new features in v5.1 will be the concept of a sub-review. This will allow a review participant to create a sub-review, review the document(s) with their own chosen review participants and then publish selected comments and changes from the sub-review to the master review. To explain:

Imagine I was a department head and was invited to review a corporate policy or procedure which affected my department. I would want to first discuss this with my management team to get their feedback and consolidate their comments. Sub-reviews are designed for exactly this scenario. We can have an ‘internal’ departmental review and then publish our consolidated feedback to the master review without having to ‘wash our dirty linen in public’.

Parallel reviews are somewhat different. These are appropriate when you want to gather feedback from two entirely separate groups at the same time, without either group being aware of the other’s existence.

This increases the workflows available in PleaseReview, so with v5.1 there will be, out of the box:

1. Standard single stage collaborative review;
2. Sequential reviews where each stage can comprise one or many participants;
3. Sub-reviews;
4. Parallel reviews.

Combinations are possible. For example, if permitted, it will be possible for a participant in a stage of a sequential review to create a sub-review. This allows for very sophisticated review management.

Context-based review

Most people see review as an ability to comment upon and mark-up a document. And, whilst that is correct, there are many ways to look at the document. You can sit down and read it as you would a book. You can follow all the cross references and therefore jump backwards and forwards in the document. Or you can ask the question: ‘What does the document say about nnnnnn’?

It is this latter approach which context-based review is designed to support. Reviewers are able to search the document for a phrase or text string and PleaseReview will produce a report with all instances of the phrase (or text string) presented in context. This is much like a concordance, in which the material words used in a work are listed together with their immediate context as a separate index.

Providing the context in the report is important. This allows the reviewer to rapidly scan the report and examine the document for consistency. It’s more than just checking for spelling or capitalization. It allows the reviewer to check that a phrase or term is used in a consistent way throughout the document.
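To make the idea concrete, here is a minimal sketch of a concordance-style context search. The function name and sample text are invented for illustration; this is not PleaseReview's implementation.

```python
import re

def context_report(text, phrase, width=40):
    """Find every occurrence of `phrase` (case-insensitive) and return
    (position, surrounding context) pairs, concordance-style."""
    report = []
    for match in re.finditer(re.escape(phrase), text, re.IGNORECASE):
        start = max(0, match.start() - width)
        end = min(len(text), match.end() + width)
        report.append((match.start(), text[start:end]))
    return report

doc = ("The server validates each request. Validation errors are logged. "
       "If the Server cannot validate a token, the request is rejected.")

# Scan the report to check the term is used consistently throughout:
for pos, ctx in context_report(doc, "server"):
    print(f"{pos:>5}: ...{ctx}...")
```

A reviewer scanning such a report would immediately spot, for instance, that "server" is capitalized inconsistently between the two occurrences.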

These subtle requirements come from being involved in endless discussions in respect of document review and from listening to people struggling with these issues. By listening to our target audience and then incorporating their requests and requirements into our product plans, PleaseReview continues to set the standard for document review.

Post review reporting

Post review reporting will further extend PleaseReview's ability to deliver metrics around the review. Whilst with the current release a set of comprehensive review metrics is already available, these are mainly delivered at a document or system level. For example, how many reviewers made how many comments and what percentage were accepted or rejected, etc.

The post review reporting available with v5.1 will allow companies to drill down deeper within the document itself.  So, for example, let’s examine section 3 of the document. How many accepted proposed changes were there? In section 4 of the document how many rejected proposed changes were there?
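A sketch of the kind of drill-down this enables. The data structures here are hypothetical, invented purely to illustrate the idea, and bear no relation to PleaseReview's actual schema.

```python
from collections import Counter

# Hypothetical review data: each proposed change records the document
# section it falls in and whether it was accepted or rejected.
changes = [
    {"section": "3", "status": "accepted"},
    {"section": "3", "status": "accepted"},
    {"section": "3", "status": "rejected"},
    {"section": "4", "status": "rejected"},
    {"section": "4", "status": "rejected"},
]

def changes_by_section(changes):
    """Count proposed changes per (section, status) pair."""
    return Counter((c["section"], c["status"]) for c in changes)

counts = changes_by_section(changes)
print("Section 3 accepted:", counts[("3", "accepted")])
print("Section 4 rejected:", counts[("4", "rejected")])
```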

This requirement has been driven by one of our clients who is looking to use review metrics to analyse the quality of writing and reviewing. By examining the number of accepted and rejected changes on different sections of the document, some initial determination can be made of the quality of the author and/or the reviewer. At the very least, flags can be raised as to which areas merit further investigation.

This illustrates that reporting can bring real value by helping to control and measure the review process. In all honesty, it's not the primary reason customers turn to PleaseReview but is simply a welcome side benefit.

Enhanced configurability

As PleaseReview gets rolled out across large organizations, the requirements of many thousands of users need to be addressed. For example, the comment categories which may be appropriate for an engineering document will not be appropriate for those in marketing and will be different again to those required when reviewing a proposal.

We could have done the bare minimum, but we took the plunge and implemented a full hierarchical inheritance model. What this means in English is that we will deliver a highly configurable system which retains central control with the absolute minimum of work required. It is possible to specify the behavior of the system for a specific department and/or review type (for review types, see below) and have one override the other. So, for example, if a review type permits the download of the original document and the departmental settings do not, the departmental settings will override the review type. We believe this level of configuration will meet the requirements of large enterprises going forward.
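The override behaviour described above can be pictured as a simple precedence chain. All of the names below are hypothetical, invented for illustration, and do not reflect PleaseReview's actual configuration model.

```python
def resolve_setting(name, department, review_type, system_defaults):
    """Resolve a setting by walking a precedence chain: department
    settings override review-type settings, which override system
    defaults (matching the blog's example of a department setting
    overriding a review type)."""
    for scope in (department, review_type, system_defaults):
        if name in scope:
            return scope[name]
    raise KeyError(name)

system_defaults = {"allow_download": True, "max_reviewers": 50}
review_type = {"allow_download": True}    # review type permits download
department = {"allow_download": False}    # department forbids it

# The departmental setting wins:
print(resolve_setting("allow_download", department, review_type, system_defaults))
```

The attraction of such a model is that central administrators set defaults once, and only the scopes that genuinely differ need any configuration at all.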

Cost Center licensing

An aligned but separate requirement is that of cost center licensing.

A large company may have a single installation of PleaseReview, but licenses are purchased from individual departmental budgets. These departments may take a dim view of another department using licenses purchased from their budget.

Cost center licensing will allow groups of licenses to be ring fenced for an individual department’s use.  Once again this facilitates enterprise deployment and, hopefully, keeps peace and harmony in the corporation.
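Ring-fencing can be pictured as per-department license pools. This is an illustrative sketch only; the class and method names are invented.

```python
class LicensePool:
    """A ring-fenced pool of licenses for one cost center."""
    def __init__(self, cost_center, seats):
        self.cost_center = cost_center
        self.free = seats
        self.in_use = set()

    def checkout(self, user):
        """Consume one seat from this pool only; never another pool's."""
        if self.free == 0:
            raise RuntimeError(f"No licenses left in pool {self.cost_center!r}")
        self.free -= 1
        self.in_use.add(user)

    def release(self, user):
        self.in_use.discard(user)
        self.free += 1

pools = {"engineering": LicensePool("engineering", 2),
         "marketing": LicensePool("marketing", 1)}

# A marketing user can never consume an engineering seat:
pools["marketing"].checkout("sarah")
try:
    pools["marketing"].checkout("andrew")   # marketing pool exhausted
except RuntimeError as e:
    print(e)
```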

Review types

Facilitated by the enhanced configurability, review types take PleaseReview's current templating capability to a whole new level.

Now standard review types can be set up which specify all review parameters including, potentially, the duration of the review, the review participants (via standard distribution lists) and a host of other configuration parameters.

So, for example, as someone who has just written a blog entry and wants it reviewed, there could be a standard review type called ‘Blog entry’. All I have to do is upload the document and select the review type, and the review will be started for a set duration with the standard set of people on the ‘blog review’ distribution list. If, in the future, I need to change who reviews blog entries, I wouldn’t change the review type; I’d simply amend the distribution list associated with the review type 'Blog entry'.

Of course, for larger companies, it would be possible to have subsets, such as engineering blog, marketing blog, etc.  

This is especially powerful when coupled with standard workflow systems such as those found in a document management system (DMS). The DMS user simply initiates a pre-set workflow which in turn calls the PleaseReview review type. In this way a sophisticated, integrated system requires very little work.
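The indirection that makes this work, a review type referencing a distribution list by name rather than listing people directly, can be sketched as follows. All names here are hypothetical.

```python
# The distribution list is referenced by name, so amending the list
# later changes who reviews without touching the review type itself.
distribution_lists = {
    "blog review": ["editor", "marketing lead", "ceo"],
}

review_types = {
    "Blog entry": {"duration_days": 3, "distribution_list": "blog review"},
}

def start_review(document, type_name):
    """Start a review from a pre-configured review type."""
    rt = review_types[type_name]
    participants = distribution_lists[rt["distribution_list"]]
    return {"document": document,
            "duration_days": rt["duration_days"],
            "participants": list(participants)}

review = start_review("april-post.docx", "Blog entry")
print(review["participants"])
```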

Archive 

Finally, we will be offering an optional archive module as a cost option.  When we initially conceived PleaseReview we saw reviews as transient instances which, when the document was approved, would be discarded. The approved document would be the true electronic record and how it got there was immaterial.

However, as we have started branching out into new market sectors and people are placing a greater importance on due diligence, compliance and being able to prove that company procedures were observed, several clients have requested the ability to archive review data.

The archiving module will meet the needs of these clients by ensuring that the review data is securely archived before a review is removed from the system.

Other stuff

Additionally there will be support for new environments. As a company offering on premise solutions (in addition to the cloud), we operate in a complex, ever changing environment as other companies upgrade their offerings.

Needless to say, with a release of this magnitude there are always minor enhancements and bug fixes included. These are too numerous to mention.

The only constant is change.

I’m confident that with v5.1 we will maintain the high standards PleaseReview has been setting for years, and that it will continue to lead the market in document review.

The evolution of testing

Posted by Ashley Harrison on 11. March 2014 11:11

Senior test analyst for PleaseTech


The test team here at PleaseTech is going full speed ahead. This is one of the more exciting times to be a tester, as the next release of PleaseReview, our collaborative review solution, looms on the horizon and a host of new functionality and enhancements starts to roll in. Stripping down a specification for new functionality where new ideas and possibly new technology are being implemented, analysing and identifying areas of risk, prioritising that risk and ultimately identifying test case criteria is what gets a tester's blood flowing – what other job pays you to break things!?

At the beginning of every release cycle for PleaseReview I sit down, look at what is coming and establish a plan of attack – and then the murmur of automation creeps into my mind. Automation is on the mind of every test team I have been a part of, whether it was only a consideration or was being actively worked on. Testing is a relatively young profession, and the core of a test team's work is still predominantly manual. Automation is the evolution of testing.

When you sit down and think about it, automation initially appears a no brainer. The brilliant thing about automation is the flexibility it provides, for example:

- It can be added to the overnight build script, which then provides you with a log of results waiting for you on your arrival in the morning, highlighting any potential issues;

- It can be used to lighten the load of regression testing, allowing manual focus to be intensified on high risk areas;

- It can even (subject to software and configuration) identify areas of code change and call on previous automation test cases that ran over that specific area of code, giving you a heads up on potential issues before you have even had the chance to look at the work item.
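The first of those ideas, a nightly run that leaves a log of results for the morning, might look something like this in outline. The checks and the log file name are invented for illustration; a real suite would drive the application itself.

```python
import datetime

# Hypothetical nightly regression runner: each check is a zero-argument
# function returning True/False; results are appended to a log that
# testers read on arrival in the morning.
def check_login_page():    return True
def check_upload_form():   return True
def check_report_export(): return False   # simulated regression

CHECKS = [check_login_page, check_upload_form, check_report_export]

def nightly_run(checks, log_path="nightly.log"):
    """Run every check, write a PASS/FAIL log, return the failure count."""
    lines = [f"run {datetime.date.today()}"]
    failures = 0
    for check in checks:
        ok = check()
        failures += not ok
        lines.append(f"{'PASS' if ok else 'FAIL'} {check.__name__}")
    with open(log_path, "w") as f:
        f.write("\n".join(lines))
    return failures

print("failures:", nightly_run(CHECKS))
```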

However, automation is not the answer to everything… Certain software and testing activities lend themselves to automation but many don’t, especially in the area of document review.

For example, it’s one thing to automatically test the completion and submissions of an HTML form, it’s another to select some text in a document and edit it to create a proposed change.  If you think about it, the test is going to work for that precise document and that precise edit. However, we can’t control what documents clients put into PleaseReview, which bits they edit and what they put in that edit. In reality, edits are frequently copied and pasted from other documents. In fact, the Word documents are frequently large, complex documents which make full use of Word’s cross referencing, field codes, styles, and so on.

So, whilst there are areas of the testing we can automate some areas will have to continue to be manual.

There is also the fact that the initial implementation of an automated suite of tests is incredibly labour intensive, as is the maintenance. Before you even get to the stage of writing test cases you must establish which software fits best and what technology you are going to use. Once that has been decided on you can get to grips with creating an automation suite.

Creating an automation suite is, in itself, a software project. It needs to be designed, developed and tested, and that’s a challenge I’m up for.

Ultimately the quality of a released product lies with me, so automation is a must have from my point of view. We pride ourselves on the quality of our product, and to maintain the high standards we have set ourselves, I plan to have automation up and running in the near future. The initial analysis suggests that implementation is not going to be easy, but who likes easy?

Watch this space and I’ll let you know how I get on.

What would you do with an extra 14 hours of spare time a week...?

Posted by Sarah Edmonds on 3. March 2014 12:01

The other half of marketing... Google


With 46% of US employees reporting that their workload is the main contributing factor towards workplace stress[1], what would you do if someone gave you an extra 14 hours a week of spare time?  Reduce your daily stress by not running from one proposal, regulatory submission or bid to the next?  Allow yourself the time to improve the quality of your work, instead of rushing to get the job done?  Give yourself a better work life balance by leaving on time to get home to your kids, go to the gym or meet friends?

Sound like a bit of a dream?  Well 14 hours a week of extra time is exactly what PleaseReview has given one of our customers: a medical writer working for a biopharmaceutical company, who is responsible for producing documentation for five major drug programmes.  Now that’s a lot of documents to write and review…

You see before they started using PleaseReview they ‘reviewed these documents in a vacuum’ without being able to see anyone else’s comments or edits, and with 20 participants involved in the process, incorporating comments and managing conflicting edits had become a real issue. 

And this outcome isn’t unique.  Ongoing evidence demonstrates that our products deliver tangible time savings – clients report savings of 50% and more in document preparation times using our solutions to improve their co-authoring and review processes.

So if this is you…

Next time remind yourself there is a better way….


[1] According to the American Institute of Stress

 

Using PleaseReview to make the impossible possible

Posted by Jason Webb on 25. February 2014 14:14

Integrations Manager


When PleaseReview was originally designed we had a vision of making document review a simpler and less painful process for those involved. The initial concept was of small tight-knit groups of 6-15 people reviewing a small number of documents, but thankfully we engineered it without technical limits and as PleaseReview has grown as a product, so have our end user requirements.
 
I was recently working with a life sciences company, and was slightly surprised to find them routinely undertaking reviews containing up to 170 documents and with 100+ reviewers. To put this in perspective, if each document was printed out for each person to work on this would result in a stack of paper roughly 590 feet or 180 meters high: twice the height of the Statue of Liberty and half as tall as the Eiffel Tower!
 
 
This volume of data is impossible to review using almost any other mechanism, due to both the number of reviewers and the number of documents (and associated comments and changes) involved: the 'traditional' email and Word track changes approach would be chaos, and a PDF based approach would require massive effort to consolidate the comments and changes and provide the associated reports. As for real-time applications such as Google Docs or Office 365 (quite apart from the issues created by hosting high value intellectual property documents there), the anarchy caused by hundreds of users making uncontrolled edits to hundreds of documents doesn't even bear thinking about!
 
The only possible way to review such large document sets is using PleaseReview. 
 
So when I asked one of the business users if she thought PleaseReview was important to her job,  she immediately responded with "No, it's CRITICAL to our process now".
 
Now, that's job satisfaction!
 
 
If you would like to know more about how our customers use PleaseReview, please take a look at some of our case studies.
 

Cloud File Sharing And The Art of Document Collaboration

Posted by PleaseTech Guest on 6. February 2014 16:01

Our guest blogger is...


Andrew Barnes, independent marketing consultant

My first experience of cloud based file sharing was as a personal user of Dropbox several years ago. As the owner of a shiny new iPad I needed to be able to transfer content between the device and my main laptop. Dropbox proved ideal.

My second encounter with cloud file sharing was not so successful. I was doing some work for a company that used a third-party cloud service as a file-sharing mechanism. I needed access to this repository, so they gave me login credentials.

I just downloaded the software to my laptop, logged in and started to access the content I needed. No training was required and everything was going well until my MacBook Air ran out of local disk space, and I wasn’t connected to the internet.

No problem I thought, I could delete the several GB of shared files that had been synched to my computer and all would be well.

As someone who has been in and around enterprise computing for the last 30 years, I was astonished at what happened next. I got back online and was blissfully unaware anything was wrong until various users started complaining that the “shared disk” was empty. My action of freeing disk space locally had synched to the cloud. No real harm was done, but it was a lesson for the system administrator to think about the security and risk implications of cloud file sharing.

Fast forward a couple of years and I’m reminded again of risk, the importance of an audit trail, and the opportunity that can be gained by knowing who did what to documents.

There is no doubt that the facility for real-time editing of documents using applications like Google Docs can be very useful when used in small measure. There is equally no doubt that it can be a nightmare.

Allowing multiple people to “real-time collaborate” on a document will get very confusing. Keeping track of who did what, and having considered discussion of, and input into, the merits of a change can make for interesting conversations; especially if all the collaborators can’t be on-line at once.

To me such editing quickly has the same effect as many people talking at once. The dominant people (not necessarily the subject experts) take over, changes are made that may not be well considered and there may be little or no audit trail available. What’s worse is that if you are off-line you are out of it.

For some, document collaboration is a regulatory requirement. For others it is a productivity and consistency cornerstone of business and sales processes. In either case a structured approach is the only way to be confident in what is produced.

And the structured approach can provide unexpected value. With the right platform tracking who has worked on which documents can help build the profile of subject matter experts. Identifying who made the most contribution to sales proposals can pin-point deal champions. Investigating the effectiveness of partners in the review process can cement relationships, or identify room for improvement.

So yes, cloud file sharing has its place, but before diving in on a large scale, think about your needs from the document collaboration process, what outputs you expect and how well you can demonstrate adherence to regulation.
