
Thirty Hours of Screening

Summary #2 on Sep 27, 2020
3 minute read

Zhelnov P. Thirty hours of screening. Zheln. 2020 Sep 27;39(2):s2e3. URI: https://zheln.com/summary/2020/09/27/2/.

Highlights

  • Three records were processed in 30 hours instead of the 1,140 planned
  • Nevertheless, I argue this is normal because the technology and methods are not yet production-ready
  • The next week’s focus will be preserving quality while smoothly increasing quantity
  • Bonus: Everyone can engage in Zheln appraisals from now on!

So What’s New?

  • In the previous summary, I concluded that everything was ready for record screening and planned to process 570 records per day.
  • It is easy to calculate that I should have processed 1,140 records in the two days that have passed since then.
  • So how many records have I screened in these 30 hours?
  • Well, just three records, actually (one, two, three).

Why So Few?

  • Personally, it feels quite natural.
  • First of all, when I set about appraising records, it quickly became evident that the editable versions were deficient: I had to change the record footers manually.
  • Also, manually selecting the record appraisal status turned out to be redundant.
  • So I needed to fix the record-maker script first.
  • After I’d fixed the script, the next thing I noticed was that the specialty-tagging methodology didn’t feel transparent enough, so I needed to elaborate on those methods as well.
  • Naturally, all of the above changes necessitated corresponding website modifications, … and so on.
  • Overall, the principal reason why, in almost a month of Zheln’s uptime, just five records have entered appraisal and none has finished it seems to be this:

    Zheln’s methods and technology are still not production-ready after a month of development, which is perfectly normal for a systematic review such as this one, especially given that it’s massive, conducted by a single researcher, and funded only by crowdfunding.

  • ‘Smaller in number are we, but larger in mind.’
  • Forcing production now would amount to suicide.

What’s Next?

  • Anyway, speeding up is also critical for Zheln.
  • Honestly, with 500 records added to the queue daily, anything less than a conveyor belt will fail.
  • So the target of 570 appraisals per day remains, but the next week’s focus will be on preserving quality while smoothly increasing quantity.
  • Once the record debt starts to shrink, I will adjust this target to account for the records missed during the current high-intensity, low-volume period.
  • As for the other directions of Zheln’s development, namely increasing appraisal completion and preparing an academic manuscript, they can happily wait until record processing is properly settled.

Bonus

  • In one of the most recent updates, I started sharing editable versions of Zheln records.
  • Why is it so exciting? 😊
  • Using these files, other people can engage in record appraisal on Zheln! That means commenting on, replicating, or editing the appraisals.
  • To do any of that, you’ll need to create a GitHub account first. Immediately afterwards, you’ll be able to fork the Zheln Methods repository into your own account.
  • In your fork, choose an editable record you’d like to contribute to and edit it.
  • Finally, create a pull request with your proposed changes (a rough sketch of this workflow follows the list). Once you’ve done that, anyone will be able to discuss your pull request publicly in the comments.
  • GitHub also offers many exciting ways to work with the records, such as comparing versions character by character, and much more.
  • This is not the simplest approach to collaboration, I admit. But it works, and I hope more collaboration is coming to Zheln in the future.
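
  • For the technically inclined, here is a minimal, hypothetical sketch of the same fork-edit-pull-request flow done through the GitHub REST API with Python. The repository path, file path, branch name, and token are placeholders of my own, not actual Zheln Methods details, so adjust them to whatever you find in the repository.

    # Hypothetical sketch of the fork -> edit -> pull request workflow via the
    # GitHub REST API. "zheln/methods", the record path, and "main" are
    # placeholders, not the real Zheln Methods repository details.
    import base64
    import requests

    API = "https://api.github.com"
    TOKEN = "ghp_your_personal_access_token"    # create one in your GitHub settings
    HEADERS = {"Authorization": f"token {TOKEN}",
               "Accept": "application/vnd.github+json"}

    UPSTREAM = "zheln/methods"                  # placeholder owner/repo
    RECORD = "records/2020-09-27-example.md"    # placeholder editable record path
    USER = "your-github-username"
    BRANCH = "main"                             # assumed default branch name

    # 1. Fork the upstream repository into your own account (may take a moment).
    requests.post(f"{API}/repos/{UPSTREAM}/forks", headers=HEADERS).raise_for_status()

    # 2. Fetch the record from your fork, edit it, and commit the change back.
    url = f"{API}/repos/{USER}/methods/contents/{RECORD}"
    current = requests.get(url, headers=HEADERS).json()
    text = base64.b64decode(current["content"]).decode() + "\nMy appraisal comment.\n"
    payload = {
        "message": "Propose an edit to the record appraisal",
        "content": base64.b64encode(text.encode()).decode(),
        "sha": current["sha"],                  # required when updating an existing file
        "branch": BRANCH,
    }
    requests.put(url, headers=HEADERS, json=payload).raise_for_status()

    # 3. Open a pull request against the upstream repository for public discussion.
    pr = {
        "title": "Proposed record appraisal edit",
        "head": f"{USER}:{BRANCH}",             # branch in your fork holding the edit
        "base": BRANCH,
        "body": "Happy to discuss this change in the comments.",
    }
    requests.post(f"{API}/repos/{UPSTREAM}/pulls", headers=HEADERS, json=pr).raise_for_status()

  • Of course, you can do all of this through the GitHub website alone; the sketch is just the same three steps, scripted.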

See You Around, Peer

Get in Touch Now or Get Back on Wed, Sep 30

Email, Twitter, Instagram, Facebook, or Telegram

Search for Appraisals, Browse by AMA Specialty, or Read Biweekly Appraisal Summaries