Open Review Toolkit: increased impact

The Open Review Toolkit is designed to lead to better books, higher sales, and increased access to knowledge.  In addition to these three goals, the Open Review Toolkit also likely leads to increased impact.  This claim is supported both by my own experience—the citations to Bit by Bit: Social Research in the Digital Age have exceeded my expectations—and by a recent white paper by Christina Emery, Mithu Lucraft, Agata Morka, and Ros Pyne.  Emery and colleagues compare three impact metrics—citations, downloads, and online mentions—for about 200 open access books and 17,000 non-open access books.  They find that open access books have 50% more citations, 7x more downloads, and 10x more online mentions.  Their white paper, “The OA Effect: How Does Open Access Affect the Usage of Scholarly Books?”, explains their methodology and supplements their quantitative analysis with interviews with authors and funders.


The Open Review of Bit by Bit, Part 3: Increased access to knowledge

open_review_session_map.png

This post is the third post in a three part series about the Open Review of Bit by Bit: Social Research in the Digital Age.  This post describes how Open Review led to increased access to knowledge.  In particular, I’ll provide information about the general readership patterns, and I’ll specifically focus on readership of the versions of the manuscript that were machine translated into more than 100 languages.  The other posts in this series describe how Open Review led to a better book and higher sales.


The Open Review of Bit by Bit, part 2: Higher sales

amazon_number1.png
This post is the second in a three part series about the Open Review of Bit by Bit: Social Research in the Digital Age.  This post describes how Open Review led to higher sales.  The other posts in this series describe how Open Review led to a better book and increased access to knowledge.

Before talking about sales in more detail, I think I should start by acknowledging that it is a bit unusual for authors to talk about this stuff.  But sales are an important part of the Open Review process because of one simple and inescapable fact: publishers need revenue. My editor is amazing, and she’s spent a lot of time making Bit by Bit better, as have her colleagues who do production and design.  These people need to be paid salaries, and those salaries have to come from somewhere. If you want to work with a publisher—even a non-profit publisher—then you have to be sensitive to the fact that they need revenue to be sustainable.  Fortunately, in addition to better books and increased access to knowledge, Open Review also helps sell books. So for the rest of this post, I’m going to provide a purely economic assessment of the Open Review process.


The Open Review of Bit by Bit, Part 1: Better books

bit-by-bit-cover-934a0a3f

My new book Bit by Bit: Social Research in the Digital Age is for social scientists who want to do more data science, data scientists who want to do more social science, and anyone interested in the combination of these two fields.  The central premise of Bit by Bit is that the digital age creates new opportunities for social research.  As I was writing Bit by Bit, I also began thinking about how the digital age creates new opportunities for academic authors and publishers.  The more I thought about it, the more it seemed that we could publish academic books in a more modern way by adopting some of the same techniques that I was writing about.  I knew that I wanted Bit by Bit to be published in this new way, so I created a process called Open Review that has three goals: better books, higher sales, and increased access to knowledge.  Then, much as doctors used to test new vaccines on themselves, I tested Open Review on my own book.

This post is the first in a three part series about the Open Review of Bit by Bit.  This post describes how Open Review led to a better book.  After I explain the mechanics of Open Review, I’ll focus on three ways that Open Review led to a better book: annotations, implicit feedback, and psychological effects.  The other posts in this series describe how Open Review led to higher sales and increased access to knowledge.


Invisibilia, the Fragile Families Challenge, and Bit by Bit

bitbybit_invisibilia_ffc

This week’s episode of Invisibilia featured my research on the Fragile Families Challenge.  The Challenge is a scientific mass collaboration that combines predictive modeling, causal inference, and in-depth interviews to yield insights that can improve the lives of disadvantaged children in the United States. Like many research projects, the Fragile Families Challenge emerged from a complex mix of inspirations.  But, for me personally, a big part of the Fragile Families Challenge grew out of writing my new book Bit by Bit: Social Research in the Digital Age.  In this post, I’ll describe how Bit by Bit helped give birth to the Fragile Families Challenge.


Bit by Bit is about social research in the age of big data.  It is for social scientists who want to do more data science, data scientists who want to do more social science, and anyone interested in the combination of these two fields.  Rather than being organized around specific data sources or machine learning methods, Bit by Bit progresses through four broad research designs: observing behavior, asking questions, running experiments, and creating mass collaboration. Each of these approaches requires a different relationship between researchers and participants, and each enables us to learn different things.

As I was working on Bit by Bit, many people seemed genuinely excited about most of the book . . . except the chapter on mass collaboration. When I talked about this chapter with colleagues and friends, I was often greeted with skepticism (or worse).  Many of them felt that mass collaboration simply had no place in social research. In fact, at my book manuscript workshop—which was made up of people that I deeply respected—the general consensus seemed to be that I should drop this chapter from Bit by Bit.  But I felt strongly that it should be included, in part because it enabled researchers to do new and different kinds of things.  The more time I spent defending the idea of mass collaboration for social research, the more I became convinced that it was really interesting, important, and exciting.  So, once I finished up the manuscript for Bit by Bit, I set my sights on designing the mass collaboration that became the Fragile Families Challenge.

The Fragile Families Challenge, described in more detail at the project website and blog, should be seen as part of the larger landscape of mass collaboration research.  Perhaps the most well known example of a mass collaboration solving a big intellectual problem is Wikipedia, where a mass collaboration of volunteers created a fantastic encyclopedia that is available to everyone.

Collaboration in research is nothing new, of course. What is new, however, is that the digital age enables collaboration with a much larger and more diverse set of people: the billions of people around the world with Internet access. I expect that these new mass collaborations will yield amazing results not just because of the number of people involved but also because of their diverse skills and perspectives. How can we incorporate everyone with an Internet connection into our research process? What could you do with 100 research assistants? What about 100,000 skilled collaborators?

As I write in Bit by Bit, I think it is helpful to roughly distinguish between three types of mass collaboration projects: human computation, open call, and distributed data collection.

Human computation projects are ideally suited for easy-task-big-scale problems, such as labeling a million images. These are projects that in the past might have been performed by undergraduate research assistants. Contributions to human computation projects don’t require specialized skills, and the final output is typically an average of all of the contributions. A classic example of a human computation project is Galaxy Zoo, where a hundred thousand volunteers helped astronomers classify a million galaxies.

Open call projects, on the other hand, are more suited for problems where you are looking for novel answers to clearly formulated questions. In the past, these are projects that might have involved asking colleagues. Contributions to open call projects come from people who may have specialized skills, and the final output is usually the best contribution. A classic example of an open call is the Netflix Prize, where thousands of scientists and hackers worked to develop new algorithms to predict customers’ ratings of movies.

Finally, distributed data collection projects are ideally suited for large-scale data collection. These are projects that in the past might have been performed by undergraduate research assistants or survey research companies. Contributions to distributed data collection projects typically come from people who have access to locations that researchers do not, and the final product is a simple collection of the contributions. A classic example of a distributed data collection is eBird, in which hundreds of thousands of volunteers contribute reports about birds they see.

bitbybit5-1_mass_collaboration_schematic


Given this way of organizing things, you can think of the Fragile Families Challenge as an open call project, and when designing the Challenge, I drew inspiration from the other open call projects that I wrote about, such as the Netflix Prize, Foldit, and Peer-to-Patent.

If you’d like to learn more about how mass collaboration can be used in social research, I’d recommend reading Chapter 5 of Bit by Bit or watching this talk I gave at Stanford in the Human-Computer Interaction Seminar.  If you’d like to learn more about the Fragile Families Challenge, which is still ongoing, I’d recommend our project website and blog.  Finally, if you are interested in social research in the age of big data, I’d recommend reading all of Bit by Bit: Social Research in the Digital Age.