We’ve just completed the 2019 Summer Institutes in Computational Social Science. The purpose of the Summer Institutes is to bring together graduate students, postdoctoral researchers, and beginning faculty interested in computational social science. The Summer Institutes are for both social scientists (broadly conceived) and data scientists (broadly conceived). In addition to the site at Princeton University, which was organized by Matthew Salganik and Chris Bail, there were also 11 partner locations run by alumni of the 2017 and 2018 Summer Institutes:
- Bamberg, Germany (University of Bamberg) organized by Julian Hohner, Thomas Saalfeld, and Carsten Schwemmer (SICSS 2018)
- Boston, MA (MIT) organized by Ryan J. Gallagher (SICSS 2018), David Hagmann (SICSS 2018), Eaman Jahani (SICSS 2018), Yan Leng (SICSS 2018), and Sanaz Mobasseri (SICSS 2018)
- Cape Town, South Africa (University of Cape Town) organized by Visseho Adjiwanou (SICSS 2017)
- Chicago, IL (Northwestern University) organized by Kat Albrecht (SICSS 2017), Natalie Gallagher (SICSS 2018), and Tina Law (SICSS 2018)
- Istanbul, Turkey (Kadir Has University) organized by Matti Nelimarkka (SICSS 2017) and Akin Unver
- Los Angeles, CA (UCLA) organized by Alina Arseniev-Koehler (SICSS 2018), Pablo Geraldo, Bernard Koch, Friedolin Merhout (SICSS 2018), and Marcel Roman
- Monterrey, Mexico (Universidad Autónoma de Nuevo León) organized by Oscar Mendez (SICSS 2018) and Adaner Usmani (SICSS 2017)
- Oxford, United Kingdom (Oxford University) organized by Taylor Brown (SICSS 2017), Nicolo Cavalli (SICSS 2018), and Ridhi Kashyap (SICSS 2017)
- New York, NY (Hunter College-CUNY) organized by Maria Rodriguez (SICSS 2017), Gleneara Bates-Pappas (SICSS 2018), Sebastian Hoyos-Torres (SICSS 2019)
- Research Triangle Park, NC (RTI International) organized by Antje Kirchner (SICSS 2017), Craig A. Hill, Alan Blatecky, Helen Jang, Jacqueline Olich, and Chloe Stephenson
- Zürich, Switzerland (ETH Zürich) organized by Elliott Ash (SICSS 2017) and Elena Labzina (SICSS 2018)
The purpose of this blog post is to describe a) what we did, b) what we think worked well, and c) what we will do differently next time. We hope that this document will be useful to other people organizing similar Summer Institutes, as well as people who are organizing partner locations for the 2020 Summer Institutes in Computational Social Science. If you are interested in hosting a partner location of SICSS 2020 at your university, company, NGO, or governmental organization, please read our information for potential partner locations.
This post includes post-mortem reports from all of our locations in order to facilitate comparisons. As you will see, different sites did things differently, and we think that this kind of customization was an important part of why we were successful.
Princeton University organized by Matthew Salganik and Chris Bail
We’ve divided this post into six main sections: 1) outreach and application process; 2) pre-arrival and onboarding; 3) A/V and room requirements; 4) first week; 5) second week; 6) post-departure.
1) Outreach and application process
We continue to think that the best way to have a great Summer Institute is to have great participants. As in previous years, we put a lot of effort into letting a big and diverse group know about the Summer Institute. Our major outreach effort began in January, once almost all of the partner locations had been finalized. We emailed former participants and former speakers. We also advertised through professional societies and asked our funders to help spread the word. Finally, we tried to reach potentially interested participants through social media, email lists, and emails to faculty that we thought might know interested participants. We were happy to learn that this year many participants heard about the Summer Institutes from a former participant.
We have used a different application system each year. In 2017, we had people email us their applications, which was functional but messy. In 2018, we used a Google form that worked pretty well. In 2019, at the request of our funder, the Russell Sage Foundation, we switched to Fluxx. Fluxx created a number of issues. These issues are specific to Fluxx, so we will not list them here, but we have shared this feedback with the Russell Sage Foundation, and we have updated the text on our application page to provide support for these issues. One challenge that has come up in all three years is the difficulty of accepting letters of reference. This always proves to be one of the most logistically difficult parts, and we would urge locations to consider these costs when contemplating requesting letters.
2) Pre-arrival and onboarding
The onboarding process was improved from last year, but it could have been better still. This year, we sent a separate email nearly every time we needed a particular piece of information from participants. We then had to compile responses and follow up with those who did not respond. An onboarding process that minimizes the number of communications and provides clear action items with deadlines would simplify this process.
A preliminary form with an early deadline could be sent out shortly after acceptances are sent out to collect the following information (and any additional info organizers foresee being helpful):
– Information for website bios (with a link to the past website provided as an example).
- Name as you would like it to appear on the website
- Short bio for the website
- Personal website if participants would like us to link to that
- Photo upload for the website
– Other info
- preferred contact email
- T-shirt size, etc. (depending on the swag we plan to give out – laptop stickers might also be a nice and affordable swag item)
- Dietary preferences
- Coding experience (helpful for TAs and organizers to get a general sense)
- Any info regarding housing preferences (site specific)
This year was the first year that we used a travel agent. This encouraged people to book earlier and removed the need for participants to pay and be reimbursed. It was a bit complicated to set up, but overall it seemed to work well, and we would encourage other sites that have this option to consider it. There were a number of questions about the reimbursement process, and we created an FAQ to help provide answers. The use of the travel agent also caused us to go over the credit limit on our university-issued card; this was addressable with a credit limit increase request.
One way to make the in-person experience run more smoothly would be to provide participants, before they arrive, with a list of all software/R packages that we will be using over the course of SICSS, and to have them obtain developer credentials for the APIs used in some of the lectures and group exercises. We should expect them to have installed or updated all necessary materials prior to the start of SICSS, to better guarantee that our code works for all participants. Participants who have difficulty installing software or packages should reach out to SICSS TAs well ahead of arrival. This would also be a good reminder to them that they will need a computer. The day participants arrive, SICSS sites could offer a pre-SICSS technical support session in which participants can work with TAs to resolve any difficulties they may have had with software installation. This could be done after the welcome dinner, for example.
One thing that we included in the pre-arrival period this year was signing up to meet with guest speakers. More specifically, we circulated a Google spreadsheet for participants to sign up for meetings with external speakers three days before the start of SICSS. We switched away from the old system, in which participants signed up for meetings the morning of speaker visits, because earlier sign-ups gave us an opportunity to share participant bios with speakers. Multiple speakers requested this information, and several reported finding it helpful for preparing for the visit. There were, however, a few challenges with the new system. Participant interest in meetings far exceeded supply. Because the sign-ups were first come, first served, participants were sometimes unable to meet with the speakers whose work most closely aligned with their own. A further challenge was that nearly all the 15-minute meeting slots were doubled up to accommodate demand, leaving less time for quality discussion. In future years, it may be a good idea to a) set a cap on the number of meetings participants can sign up for or b) solicit ranked preferences from participants and allocate meeting slots centrally. It may also help to extend meeting times slightly, perhaps to 20 minutes, although this may come at the cost of fewer meeting slots.
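A centralized ranked-preference allocation could be as simple as a snake draft over a randomized participant order. The sketch below is only an illustration of that idea; the function names, the per-participant cap, and the snake-order rule are our assumptions, not an existing SICSS tool:

```python
import random
from collections import defaultdict

def allocate_meetings(preferences, capacity, max_meetings=2, seed=0):
    """Allocate speaker meeting slots from ranked preferences.

    preferences: participant -> list of speakers, most-preferred first.
    capacity: speaker -> number of open meeting slots.
    """
    rng = random.Random(seed)
    remaining = dict(capacity)
    assigned = defaultdict(list)
    order = list(preferences)
    rng.shuffle(order)  # random serial order, so no one always picks first
    for _ in range(max_meetings):
        for person in order:
            # Give each person their highest-ranked speaker that still
            # has an open slot and that they have not already claimed.
            for speaker in preferences[person]:
                if remaining.get(speaker, 0) > 0 and speaker not in assigned[person]:
                    remaining[speaker] -= 1
                    assigned[person].append(speaker)
                    break
        order.reverse()  # snake order evens out the draft advantage
    return dict(assigned)
```

Capping `max_meetings` directly addresses the demand problem, and the randomized-then-reversed order keeps a first-come-first-served advantage from compounding across rounds.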
Speakers were sent overall itineraries for their visits about a week before their talks. A second version of the itineraries was sent 2-3 days in advance with meeting sign-ups and participant bios. As part of preparing for visits, it would be a good idea to ask speakers:
- Their video live streaming preferences. We offered three options: no video archive or livestream, livestream but no video archive, both livestream and video archive. We should also share the video release form used by the host institution.
- Their slide posting preferences. We offered three options: no posting, post to SICSS Slack only, post to Slack and GitHub
- Their talk title and/or slides whenever they are ready. Having talk titles in advance can help with matching participants with speaker meeting signup slots.
One final piece of pre-arrival logistics we could have done better is t-shirts. Even though it went better than last time, ordering and distributing the t-shirts still ended up being a bit rushed this year. Having the appropriate sizes for folks at all locations, and a vendor that can ship to all locations, would help.
3) A/V and room requirements
In our first year at Princeton we were in a beautiful room that was wired for livestreaming. We didn’t realize how lucky we were. The A/V and room are inter-related issues, and we definitely could have done better this year. The picture above shows the A/V set-up for the room we ended up using. If your A/V set-up does not look like this, you might have trouble sending out a high-quality livestream, which is especially important for some sites.
For the room, here’s what we should ask for in the future, split between must-haves and nice-to-haves:
Must haves:
- ability to have high-quality livestream out (more on that below)
- ability to easily project slides from a laptop or computer
Nice to haves:
- plenty of outlets for laptops (could be accomplished by extension cords and power strips)
- space for about 40 participants to sit in a U-shape
- flat floor (no bowls) with moveable furniture
- nearby breakout rooms/workspace
- natural light
- nearby space for catered lunch
- same room for the entire two weeks
What we learned this year is that while you can livestream from any room, some rooms make it much easier. For the future, we want a room with:
- Ability to record the projection, plus a live video switcher/audio mixer
- Room wired to record audio from participants, or two or more handheld mics
- Video switching so that the video feed can be a mix between the camera and the slides
Also, we don’t just need livestream, we need livestream with buffering. For example, if folks start watching 1 hour after we start, they need to be able to go back to the beginning. This kind of thing is possible with YouTube Live and perhaps other services. It is not always possible with basic livestreaming. This full list of requirements needs to be clear to the A/V team far in advance. We tried to get set up for the first time on Monday morning only to discover that we were missing a cord. This led to a series of patches, which caused the video quality on Monday to suffer. Doing a full test may add to costs, but it is worth it.
4) First week
The summer institute is split into two main chunks. In the first week, we have instructional lectures in the morning, group activities in the afternoon, and guest speakers before dinner. Overall, we think that the first week went pretty well.
A general theme over the years is that we are decreasing the amount that we teach and increasing the time that we have for group activities. We think this is a positive development, but it means that we need to do a better job of ensuring that groups work effectively and of specifying the goals and deliverables for each group activity. This year we tried to do a better job of randomizing groups (i.e., stratified sampling) to ensure that each group had a mix of experience with the topic of the activity, and of holding a discussion at the end of the activity to draw out themes. Next year, we could do a better job of specifying the deliverables of the activity, reminding the participants how to work together, and checking in with the groups to make sure that they are working well together.
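As a concrete illustration of the stratified randomization mentioned above, one sketch (the function and field names are ours, not from an actual SICSS script) is to shuffle participants within each self-reported experience level and then deal them round-robin into groups:

```python
import random
from collections import defaultdict

def assign_groups(participants, num_groups, seed=2019):
    """Stratified random assignment: shuffle within each experience
    level, then deal members round-robin so every group draws from
    every experience level with the day's activity topic.

    participants: list of (name, experience_level) tuples.
    """
    rng = random.Random(seed)
    by_level = defaultdict(list)
    for name, level in participants:
        by_level[level].append(name)

    groups = [[] for _ in range(num_groups)]
    i = 0
    for level in sorted(by_level):  # deterministic stratum order
        members = by_level[level]
        rng.shuffle(members)        # randomize within the stratum
        for name in members:
            groups[i % num_groups].append(name)
            i += 1
    return groups
```

Because the counter runs continuously across strata, group sizes differ by at most one, and each experience level is spread across the groups rather than concentrated in one.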
Challenges continue to be teaching to participants with diverse backgrounds and interests, too much overlap with Bit by Bit, not enough time for the topics we cover, and not enough time for more topics. The three most requested additional topics were: machine learning, agent-based modeling, and social networks. We don’t see easy answers to address any of these challenges, so we just try to explain to participants why we are making the choices that we are making, and we could add a lunch discussion about how participants learn new skills.
One positive development we have made in reducing the time crunch is to move administrative work elsewhere. For example, this year for the survey activity we created pre-funded MTurk accounts for participants to use. This saved the time that used to be spent putting money on the participants’ accounts. One area where we could improve is guiding participants through the application process for Twitter’s API credentials ahead of time. This would allow us to guide people through the language requirements of the application while minimizing delays caused by waiting for applications to be approved. Alternatively, we could schedule one of our coffee/lunch breaks to come immediately after applying for credentials for a more efficient use of time.
One last area where we can improve is involving the partner locations in questions to the speakers (both internal and external). This year we had a dedicated slack channel for questions and one of the TAs would read them (identifying the name and location of the questioner). One challenge was that we often had only one working portable microphone and that tended to be dominated by the local participants. Next year it would be better to be in a room that is wired to record all sound or have two or more microphones. Also, next year, we should have a daily keep-start-stop survey with the partner location organizers.
5) Second week
The second week is focused on participant-led group projects. Preparation for these projects began during the first week, when we told participants what was coming; this year in particular, we helped clarify the range of projects that are possible: something that might become a publishable paper, a public good (e.g., a new dataset or piece of software), or a project that would help the participant develop new skills. This was a marked improvement on last year, when we encouraged groups to focus on tangible outputs by the end of the Summer Institute, which was not feasible for many groups.
With the appropriate background set in week one, our research speed dating, which took place on the first day of week two, worked well. We had people list areas of interest. Then we formed maximally similar groups, which had about 45 minutes to brainstorm projects. After that, we reconvened and each group reported back some of their project ideas. Next, we formed maximally dissimilar groups and the process repeated. At the end we had a spreadsheet with a list of project ideas, and people joined up on those projects right before lunch. Next year it might be helpful to have time for a question-and-answer period about the ideas in the big group. This would help clarify the ideas, potentially merge groups, and give everyone a bit of time to think.
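For the "maximally similar" and "maximally dissimilar" rounds, one simple heuristic is to score every pair of participants by the Jaccard overlap of their listed interests and match greedily from the best (or worst) score. This is only a sketch of one way to automate the matching under our own assumptions; it pairs people rather than forming larger groups, and it is not how any site actually formed groups:

```python
def jaccard(a, b):
    """Overlap between two participants' interest sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def pair_off(interests, similar=True):
    """Greedily pair participants by interest overlap.

    interests: participant -> list of interest keywords.
    With similar=True, the most-overlapping pairs form first;
    with similar=False, the least-overlapping do.
    """
    people = list(interests)
    unmatched = set(people)
    pairs = []
    # Score all candidate pairs, best (or worst) first.
    scored = sorted(
        ((jaccard(interests[p], interests[q]), p, q)
         for i, p in enumerate(people) for q in people[i + 1:]),
        reverse=similar,
    )
    for _, p, q in scored:
        if p in unmatched and q in unmatched:
            pairs.append((p, q))
            unmatched -= {p, q}
    return pairs, sorted(unmatched)  # leftover person if the count is odd
```

Running the same scoring with `similar=False` gives the dissimilar round for free, which mirrors the two-pass structure of the speed dating described above.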
There are a number of questions that came up in the second week about IRB and ethics. As organizers, we can provide feedback about ethics, but we cannot act as an IRB. It is also not clear what role the IRB of the host institution should play. In the future, it might be a good idea for the local organizer to contact the IRB of their host institution before the Summer Institute to help set ground rules and ensure that everyone has the same expectation.
We tried two new things with the participant-led group projects this year. First, we dramatically streamlined the process of getting micro-grants (hundreds of dollars) during the Summer Institute. This was a big improvement, and it allowed the groups to work more quickly and with less stress about funding. Second, we created a sign-up sheet to reserve a time to meet with the organizers. This was a big improvement over the ad hoc system we used in the past. In addition to signing up to talk about the projects, many participants also signed up to talk about other issues. We also noticed that some of the participants who signed up to meet with us tended to be less likely to ask questions in other settings. We would definitely recommend this at other sites.
One other new thing that we tried in the second week was a multi-location panel, which featured speakers from Boston, Istanbul, Oxford, Princeton, and Research Triangle Park (see image below). To host this panel, which was on the job market, we used the video conferencing software Zoom and we were able to livestream it through YouTube Live. The content of the panel was interesting, and the fact that we were able to have participants from so many different locations helped increase the quality of the discussion. We hope to have more multi-location panels in the future.
6) Post-departure
After departure, we did a number of activities to archive the materials on our website. We also processed travel reimbursements for participants who did not book with our travel agent, and we worked with our local vendors to pay all charges quickly. We also did a better job this year of setting expectations for participants about the post-Institute funding that could be available for research projects. We told participants that they would have a decent sense of the amount of remaining funds by August 1, and that applications would be due around that time. We added all participants from our location to the SICSS alumni Facebook group, and we scheduled a meet-up during the American Sociological Association annual meeting in New York in August.
Overall, the Summer Institute continues to run more smoothly each year. We can see that participants—and organizers—are less frantic, which enables everyone to devote their energy to more important things such as teaching, learning, and building community.
Bamberg, Germany (University of Bamberg) organized by Julian Hohner, Thomas Saalfeld, and Carsten Schwemmer (SICSS 2018)
This is the post-mortem for the SICSS 2019 partner site hosted by the Bamberg Graduate School for Social Sciences (BAGSS) at the University of Bamberg in Germany. The institute took place from July 29 to August 9. We provide a brief summary of the different stages of the institute as well as our impression about what went well and what we would try to improve in the future. Our post-mortem might be of special interest to partner site organizers that will not be able to watch live-streams of the main event in real time.
Outreach and application process
While many SICSS partner sites invite applications from PhD students, postdoctoral researchers, and untenured faculty, we decided to additionally invite Master's students in Bamberg. For this reason, we had to postpone the event to the end of the summer term of the German academic year, so that students could attend without missing any regular courses. This had the positive effect that it was much easier to find a venue and the negative effect that we could not interact with participants from other SICSS events in real time. In order to reach potential applicants, we distributed a call for applications on social media sites and several mailing lists. We also reached out to administrative staff at several German universities, asking them to share the invitation on their local communication channels.
We mostly adapted our application procedure from the SICSS 2018 Google form. For the most part this worked quite well, but a small number of applicants contacted us because they felt uncomfortable using Google Forms for two reasons: they either did not want to create a Google account for uploading files, or they did not want their files to be stored on Google servers. We addressed these issues by allowing such applicants to include links (e.g., to their CV) instead of attaching files to their application forms. In hindsight, however, Google Forms is not the optimal platform for collecting application materials. What we would also probably change in the future is the requirement for recommendation letters. Especially for students at the Master's level, it can be difficult to find someone who is willing to write such a letter.
For sending out emails and requesting information from participants, we mostly copied the text of the emails sent out by Matt and Chris for SICSS 2018 and adapted them to our local needs. As re-sending emails to those who did not respond was time-consuming, we like the idea mentioned in the Princeton post-mortem of using a preliminary form to collect information from likely participants.
Overall, organization relating to pre-arrival, including issues around speaker invitations, R programming, and reading materials, worked quite well. However, there were two exceptions. First, we struggled considerably with administrative peculiarities of the accounting systems of Bamberg University, which are typical of German universities. We are not going to include any details here, as they would not be useful for partner sites outside Germany. Nevertheless, we recommend allowing a considerable amount of time for administrative tasks, as one of our co-organizers (Julian) spent many, many hours dealing with administrative obstacles. Second, we had severe difficulties for several months with transferring funds from the US to the bank account of our university. Again, we won’t provide specific details on this issue, but we recommend that future partner-site organizers work on the technicalities of fund transfers as early in the process as possible.
As SICSS Bamberg took place more than one month after SICSS Princeton and most other partner sites, we were not able to rely on live-streamed lectures. One solution would have been to use archived YouTube video lectures for teaching only, but we decided against that idea for two reasons. First, we noticed that unfortunately the quality of some video lectures was not optimal. Judging from some preliminary tests, listening to lectures with suboptimal quality can be exhausting, especially when listening to several of them in succession. Our second reason for not only relying on video lectures was that we wanted the teaching situation to be engaging rather than cinematic. Therefore, we decided to locally teach as much content as possible and only rely on video lectures for additional talks or topics that we were not able to cover with local lecturers.
Carsten gave the introductory lectures on the first day and covered half of the material on ethics. He also covered all teaching related to collecting digital-trace data and text analysis. Content related to ethics and collecting digital-trace data was to a large extent adapted from Princeton, but our text analysis lectures covered some aspects specific to Bamberg. Instead of the mass collaboration day, a local guest lecturer (Oliver Posegga) gave an introduction to network analysis. We relied on archived videos from Princeton for two ethics lectures as well as all lectures on surveys and experiments. In hindsight, we would have shortened the group exercises for surveys, as deploying a crowdsourcing task and going through post-stratification was considered too much by many participants. One solution could be to move crowdsourcing exercises to the day on experiments, although this might reduce the amount of weekend free time available to participants.
To collect feedback on each day of the first week, we copied the “keep / stop / start” feedback form from SICSS 2018. By far the most frequent feedback was that participants enjoyed the group exercises. Critical feedback was mostly related to issues with our venue, e.g., heat or acoustics. We learned from this that, even for a small group, it can be useful to have access to a mobile microphone that can be passed around. To our surprise, most participants did not consider video lectures cinematic, straining, or unhelpful. According to their feedback, one of the reasons why they had no problems with watching videos was a continuous effort by the organizers to pause videos at appropriate times to ask questions or add comments (e.g., on differences in rules-based ethics approaches between the US and Europe). It was also mentioned that video lectures are more enjoyable at the beginning of a day or after lunch rather than in the evening. Overall, we were quite happy with our mix of local lectures and video lectures, but the success of this approach strongly relied on a sufficient quality of available YouTube videos. Another thing worth mentioning is that we did not experience significant issues relating to possible skill gaps between participants in terms of substantive knowledge and coding, despite hosting a range of participants from Master's students to untenured faculty.
Participants enjoyed the research speed dating (for an explanation of the procedure please read the post-mortem of SICSS 2019 Princeton), and group formation worked quite well without any major issues. We were very excited to notice that not a single group project relied on using Twitter data 😉
Similar to SICSS 2018 at Duke, we asked participants to write up a short summary of their projects, which they also used to indicate whether funding was needed (e.g., for crowd workers). Our group presentations took place on Thursday, so participants had one day less to work on their projects than at Princeton. We tried to mitigate this by limiting the available time slots for flash talks and by scheduling only two visiting speakers in the second week. There is not much more to report about our second week, as it ran very smoothly.
Post-institute and concluding remarks
At the time of writing we are still in the process of finalizing administrative matters. Similar to what we mentioned above, transnational fund transfers continue to be a time-consuming issue. With regards to research projects, several groups mentioned that they would like to continue their work after SICSS Bamberg. As we have some funding left to spend, we will invite all groups to submit extended funding proposals for their projects to support them in this regard. As for materials, we archived all slides, code and data on Github. Several participants mentioned that they would be happy to use it for teaching purposes, which in our opinion highlights the value of open source commitment.
Overall, we received a lot of positive feedback about SICSS Bamberg from both participants as well as visiting speakers. In our opinion, one of the major reasons behind this was the daily evaluation of feedback and the effort to implement any suggestions on short notice. In general, we conclude that organizing a partner institute that does not take place at the same time of the main event has some disadvantages (e.g. not being able to interact with each other on Slack in real time), but it is certainly possible by relying on a mix of local lectures and archived videos.
To conclude this post-mortem: we consider SICSS Bamberg a success, and we really enjoyed helping to build a computational social science community in Europe.
Boston, MA (MIT) organized by Ryan J. Gallagher (SICSS 2018), David Hagmann (SICSS 2018), Eaman Jahani (SICSS 2018), Yan Leng (SICSS 2018), and Sanaz Mobasseri (SICSS 2018)
This is the post-mortem for the SICSS 2019 Boston partner site hosted at the Massachusetts Institute of Technology (MIT). The institute took place from June 17 to June 28. Below, we discuss what we did, what we think worked well, and what we would do differently for various aspects of the partner site, including outreach, the application process, pre-arrival logistics, A/V and room setup, the first week of the institute, the second week, and post-departure logistics.
Outreach and Application
Our partner site was fortunate to have 5 former SICSS 2018 participants as the organizers. At the beginning of planning the partner site, including where we would host it, we were able to divide tasks among ourselves so that no one person took on too much work.
We advertised applications through a number of Boston computational social science and digital humanities listservs (which we were lucky to have some major senior researchers amplify), through social media, and informally through our own social networks. We received 58 applications in total, 29 of which we accepted.
Ultimately, 17 participants were able to join us for the two weeks. About half of our applications were from individuals who did not live in Boston (or even near Boston), but we were not able to provide any funding for travel and accommodations, which is why a fair number of those who were accepted were not able to join us. Some were also accepted to other SICSS partner sites and decided to attend those instead. We were content with the size of the partner site, though, and we believe that the relatively small size helped us build a more welcoming community. We ended up having a fairly equal gender distribution between men and women (8 out of 17) and some racial diversity, though we could have more actively solicited applications from those who are typically underrepresented or marginalized in computational social science through professional groups aimed at promoting those populations.
We accepted applications through Qualtrics, which worked well for us because some of the organizers were experienced with using that platform. Like many of the other sites, letters of recommendation were received separately via e-mail. All of the organizers met and collectively sorted the applications into Accepts, Maybes, and Rejects. Importantly, we blinded ourselves to characteristics like gender, home university, and position in career trajectory. Fortunately, this worked well for us, and even at the end of the first round we had a diversity of potential participants (when we unblinded). The application review process went fairly well overall, but it took some time for us to all understand what we valued in applications and what we envisioned our partner site to be. All of the organizers seemed to have taken different things from their time at SICSS 2018, which speaks to its impact, but we had to resolve what we saw for our own iteration of SICSS. It would have helped to clarify these different expectations and considerations before starting the review process, rather than having them emerge as we went over more applications. In general, we would be interested in having a conversation with other partner sites about how they evaluated applications and plan to in the future.
We accepted applications for a brief period after the decisions from the Princeton site had been sent out, so that those who were rejected could apply to our site if they wanted to. It took some logistical gymnastics to find a time when all 5 of us organizers were free to review applications, though, so we were a bit slow getting decisions back to applicants. We were able to get all decisions out by May. Particularly for those who were traveling quite a distance to Boston, this made their travel plans hectic, and we could do better by coming to decisions more quickly, particularly on those who were Maybes in our first round of reviews.
Pre-arrival and Onboarding
Pre-arrival logistics were not too complicated. Admittedly, while we set up an e-mail address for the Boston partner site as a whole, we did not check it frequently prior to reviewing applications. We realized we had dropped the ball on this when, while reviewing applications, we discovered we had missed important questions potential applicants had sent us. After that, we were more attentive to the e-mail address, including when soliciting bios and photos from the accepted applicants. Since we did not have funding for anyone’s travel or lodging, we did not have to deal with any financial coordination with the participants prior to their arrival.
There was an issue with DataCamp this year because the company had covered up sexual misconduct by its CEO, and so many individuals in the R and computational social science communities were boycotting its use. This could not have been anticipated, but it may help in the future to offer a diversity of learning resources for participants to use prior to arriving at the institute.
Probably the most difficult pre-arrival logistics concerned food. We had a lot of miscommunication over what food we would provide for breakfast, lunch, and dinner and how it would be paid for. The payment was complicated because we had two separate grants: one from the site at Princeton that supported our partner site (similar to the ones every other partner site received), and one from Boston University’s Questrom School of Business. This seems obvious in hindsight, but it would have helped to just make a spreadsheet with each day’s food and costs listed out so that there was less miscommunication. We ended up having lunches catered through MIT, which relieved a lot of stress. We bought all breakfast foods ourselves in bulk. This saved us money, but it meant we needed to make several trips to pick up food over the course of the institute. We had enough money to be able to take participants out to dinner most nights.
We were also a bit late in reaching out to local speakers. As we discuss further below, we had a lot of last-minute scheduling, and we could have been more diligent about finalizing our speaker list before the partner site started running. This would have helped us better advertise the talk descriptions and speaker bios to participants before their arrival.
Audio/Visual and Room Requirements
Our room at MIT worked pretty well for us during the two weeks of the partner site. In terms of the room itself, we did not have many issues. Most of the issues with streaming were on the Princeton side, not with our ability to broadcast. However, we did not have a dedicated computer in the room, so we had to find an additional laptop for livestreaming so that all of the organizers could continue to coordinate on their own computers while the stream ran.
We livestreamed our local speaker talks, which worked out pretty well in general. One of us, David Hagmann, really took the lead on this and did a lot of last-minute work with the software, visuals, and audio to make sure those talks were broadcast properly.
We mostly followed the schedule of the main site during the first week. We would watch the livestreams of the lectures in the morning and do the group activities in the afternoon. We sometimes deviated from the Princeton schedule when they had an invited speaker; instead, we had a number of local speakers who came to give a talk and meet with students. Overall, having local speakers went really well, and participants were definitely more engaged with live speakers than with watching one via the livestream. We think a big reason this worked is that the livestreamed talks were recorded, so participants could still watch them later if they wanted to.
As mentioned briefly earlier, though, we were late in scheduling a lot of our local speakers. We scheduled a fair number of speakers after the partner site had already started running. It was fantastic that so many speakers expressed interest in talking and made the time to come to MIT, but we could have avoided a lot of scheduling stress if we had finalized those speakers in advance of the partner site. This would also have helped us coordinate student meetings with the speakers.
We also had more difficulty recruiting a TA than we expected. We were able to get two different TAs to help us and split their time between the two weeks, but this was another thing that we had to coordinate during the first week of the institute that we should have been able to take care of earlier.
We forgot some small things on the first day: name tags for the participants and extension cords for laptops. These items easily fall under the radar, but they are fairly important. Fortunately, we were able to deal with both of these issues by the second day.
Having five organizers certainly helped get everything done during the week (setting up the livestream, setting up breakfast, scheduling and working with speakers, running group activities, making reservations for dinner). While the first week was stressful just in terms of running the partner site for the first time, it felt like we were able to distribute that stress evenly so that no single organizer bore too much of it.
During the second week, Chris Bail was able to join us and he led the research speed-dating activity at our site, which was fantastic. It seemed like research groups and ideas emerged very naturally and it was a surprisingly smooth process getting everyone started working on the research projects for the week.
We are glad that we had funding to support research projects through small grants. We had participants write proposals for their projects, which we discussed with them and helped them revise so that they would be able to use the money effectively. We were fairly generous in funding projects, though, so that groups could get started and have some results by that Friday. It may have helped to add a bit more structure to this process, even if we had just formally written up our expectations for the proposals. In general, while participants knew that they were giving a talk on Friday about their projects, it would have also helped to set more regular check-ins on the projects and their statuses. No group appeared to fall through the cracks, but we can imagine how this could happen without more structure.
Because we were unable to fund travel or lodging, we did not have to coordinate any reimbursements with the participants. After accounting for all of our expenses during the two weeks of the partner site, we realized we had enough money to fund a second round of proposals for projects. We wrote more formal expectations for these proposals, and we are accepting them through early September, with the intention for all funds to be disbursed by December. This is an ongoing process, but so far there have not been any major difficulties coordinating that funding with the various research projects that have sent proposals in.
Overall, we feel that the SICSS 2019 Boston partner site was a success, and we are all grateful to have the opportunity to host it and work with so many brilliant participants!
Cape Town, South Africa (University of Cape Town) organized by Vissého Adjiwanou (SICSS 2017)
The second edition of the Summer Institute in Computational Social Science (SICSS) was held from June 17 to June 28, 2019 at the University of Cape Town. It was organized by Dr. Vissého Adjiwanou, Associate Professor of computational and quantitative methods at the Université du Québec à Montréal (UQAM) and Adjunct Senior Lecturer at the Centre for Actuarial Research (CARe) at the University of Cape Town (UCT). He is also the chair of the Scientific Panel on Computational Social Science at the Union for African Population Studies (UAPS). Like last year, this year’s institute was run in collaboration with Princeton University and benefited from funding from the International Union for the Scientific Study of Population (IUSSP), which supported the participation of scholars from outside Cape Town. The summer institute was also supported by the Russell Sage Foundation and the Alfred P. Sloan Foundation. This year’s participants came from various countries and work in different areas. Out of the nearly one hundred applications we received, we supported the participation of 22 participants (12 from outside South Africa, 5 from outside Cape Town, and the rest from the Cape Town area).
While following the same agenda as last year (a first week of teaching and a second week on projects), this year’s summer institute innovated by incorporating new materials and adopting new approaches to collaboration. The teaching from Princeton included topics on ethics, surveys, and experiments in the digital age by Matthew Salganik, and on digital trace data collection and analysis (text analysis, topic modelling, and network analysis) by Chris Bail. On-site training by Nick Feamster, Neubauer Professor of Computer Science and Director of the Center for Data and Computing (CDAC) at the University of Chicago, introduced participants to machine learning. At the opening, Vissého Adjiwanou presented the importance of the Summer Institute for African scholars, led discussions on issues of causality in social science, and considered how new methods and data can contribute to a better understanding of social phenomena in sub-Saharan Africa. The first week also saw various talks by experts in the field: Kyle Finlay presented his work on mapping conversations on Twitter in South Africa, Hussein Suleman spoke about data management, Marshini Chetty presented her work on detecting endorsements on social media, Megan Bruwer discussed data visualisation techniques in transportation, and Vukosi Marivate presented his various projects on data science and development. Finally, Aldu Cornelissen, an alumnus from last year and a teaching assistant, talked about his experience and how SICSS has shaped his career. These varied, high-quality research presentations made the Summer Institute one of a kind in sub-Saharan Africa, where participants learned from specialists from Africa and around the world.
In the second week, the participants worked in four small groups on various applied projects, which were presented on the last day. The first group developed a machine learning approach to predict child mortality in South Africa. This project will be transformed into a grant proposal, and the techniques developed will be applied to various other databases, such as the demographic surveillance sites in Africa. The second project, called Afro-Twitter-Barometer, borrows from the idea of Afrobarometer: the group developed a new tool (a Shiny app) that presents sentiment analysis of Twitter data from various countries in sub-Saharan Africa. The future extensions of this project are also very broad: it could, for example, be used to monitor emerging epidemics within the continent. Detecting predatory job ads on the web was the focus of the third group, which used various techniques and machine learning approaches to tackle this problem, using data from Gumtree, an online classifieds website in South Africa. Last year, a group developed a project to measure depression and stress-related issues by analysing Twitter data. That project was developed further this year by applying machine learning to identify markers of suicidal behaviour among Twitter users in Africa. What all these projects show are the new areas of research becoming available to researchers in social science, and how researchers can think outside their comfort zone and develop new research agendas that affect people’s lives. This is well summarised in a post by Fidelia Dake, a researcher at the Regional Institute for Population Studies at the University of Ghana and one of this year’s participants: “SICSS taught me to think differently about what data are, the different ways in which data for social science research can be obtained”. Many of these projects were accepted for presentation at the VII African Population Conference to be held in November in Kampala.
Running the summer institute was not without clear challenges this time. Some of the challenges related to timing: June still falls within the academic term in South Africa, and several strong Ph.D. students whom we had accepted declined at the last minute. Second, it was difficult to coordinate the institute from Canada. This placed much of the workload on me, and without a local organizer on site, it will become difficult to sustain the activity. Third, while I have been very successful in securing funding from outside the SICSS community to support the participation of scholars from outside South Africa, this source of funding is becoming harder to obtain, even as interest in the institute grows within the continent (as evidenced by the surge in applications). Digital trace data and computational techniques hold great promise for renewing research in sub-Saharan Africa, as they will bring new ideas and approaches to the various issues facing the continent. Training like this is therefore very important to continue developing. I worked with talented Ph.D. students, mostly from sub-Saharan Africa, to conduct new research at the intersection of computational science and health and social science, and to build new training to hone the capabilities of researchers from the continent. Another initiative is a new centre on computational and quantitative approaches to social and health issues (especially in sub-Saharan Africa) that is currently under development within my department.
Chicago, IL (Northwestern University) organized by Kat Albrecht (SICSS 2017), Natalie Gallagher (SICSS 2018), Tina Law (SICSS 2018)
The 2019 Summer Institute in Computational Social Science at Chicago (SICSS-Chicago) took place from June 17 to June 28, 2019. SICSS-Chicago was hosted at Northwestern University and organized by Kat Albrecht, Natalie Gallagher, and Tina Law. This is the second year that Chicago has served as a SICSS partner site. In this post-mortem, we reflect on 2019 SICSS-Chicago. Specifically, we discuss what happened, what worked, and what could be improved as it relates to: (1) the application and selection process, (2) pre-arrival preparation, (3) logistics, (4) the first week, (5) the second week, and (6) post-departure.
Application and Selection Process
Our application process took place between January and March 2019. We began by reviewing our application from last year. We decided to use the same application as before, which required applicants to provide: a CV, a statement of interest, a writing sample, and a letter of recommendation (for current grad students only). By implementing a more extensive application process, we hoped to signal to participants that the institute was a substantial commitment and a fully considered effort. In an effort to increase the diversity of our applicant pool, we also added a statement to our website explicitly stating our commitment to diversity and inclusion in CSS and welcoming applicants from groups currently underrepresented in CSS, and we collected demographic information from our applicants. Once we finalized the application and made it available on our website, we reached out to faculty and staff from Northwestern, UChicago, DePaul, University of Illinois-Chicago, Loyola, and Chicago State University to help us spread the word.
Our application deadline was March 30, 2019 and we ultimately received 39 applications. Compared to last year, we saw a slight increase in applications (n = 39 for 2019 and n = 33 for 2018). There was also greater institutional and geographic diversity in this year’s applicant pool compared to last year, with more applicants from throughout Chicagoland and the Midwest and some even beyond the Midwest and the U.S. Although we do not have demographic information about our applicants from last year, we do have this information for this year. With regard to race, 38% of applicants self-identified as Asian/Asian American, 23% as white, 13% as Latinx, 8% as African American, and 5% as Arab American or Middle Eastern, with 5% not responding. In terms of gender, 44% of applicants self-identified as female, 38% as male, and 8% as gender nonconforming, nonbinary, or queer, with 10% not responding. A number of our applicants also self-identified as queer, international students, and immigrants. A returning organizer noted that the overall quality of applications had improved in our second year.
Our selection process took place in April 2019. Each organizer reviewed all applications and determined her selection recommendation (e.g., “Yes”, “No,” or “Maybe”) based on her own selection criteria. The organizing team then met to discuss their recommendations. We first admitted all applicants that received unanimous “Yes” recommendations and then deliberated on other applications for the remaining spots. We ultimately selected 20 applicants. We notified applicants in mid-April and gave them about two weeks to respond to our offers. We ended up with a cohort of 20 participants, though two participants excused themselves during the first week of the summer institute.
What worked well was our application and selection timeline, the application, and the selection process. There seemed to be sufficient time for applicants to apply, for organizers to review applications, and for a cohort of participants to be selected well before the summer institute. The application itself was fairly straightforward and there weren’t too many questions about it. And the organizers were able to come to consensus about participation selection fairly quickly and easily.
There is room for improvement in terms of recruitment. We have now developed a decent email list for outreach, but we can certainly grow this list and ensure that we are following up with people. It would also be helpful to continue collecting demographic information on applicants so that we’re able to assess our efforts toward increasing the diversity of our applicant pool. In addition, we encountered a new problem this year with attrition. From speaking with the two participants who excused themselves during the first week, the attrition seems to stem not from dissatisfaction with the summer institute but from underestimating the time and work commitment involved. In the future, we may want to make the time and work commitment even clearer to participants.
Our onboarding process took place in May and June 2019. After selected applicants confirmed their participation, we reached out via email to ask them to: (1) provide a short bio and photo for our website, (2) provide information for various logistics (e.g., dietary restrictions, t-shirt size), (3) get set up on Slack, (4) check our website regularly for schedule updates, and (5) prepare for the summer institute by completing either some social science readings or coding training through RStudio Primers and other online resources. We also offered TA help to our participants—through our own TA and the TAs at Princeton. And we made sure to send out a reminder email to participants about a week before the summer institute.
The onboarding process seemed to work out just fine. There generally weren’t too many emails/Slack messages needed, perhaps owing to the fact that our participants coordinated their own travel and housing arrangements. The participants largely provided the information that we requested in a timely manner, though some follow-up work was needed at times. And our participants all seemed to arrive prepared. Compared to last year, we had very few issues with coding preparation, perhaps because most of our participants this year had some experience with coding.
There were a number of logistics that needed to be coordinated before, during, and after the summer institute. Namely, there were six main logistical tasks. First, we worked on getting an MOU and MOU addendum in place between October 2018 and June 2019. This process involved getting signatures from the organizers, faculty advisor, TA, and university financial administrator and ensuring that officials at Northwestern and Duke had all necessary paperwork to execute the MOU. Second, we worked to coordinate the transfer and administration of grant funds between October 2018 and August 2019. This process involved working with our financial administrator at the Northwestern Institute on Complex Systems (NICO) to ensure that grant funds had been transferred by Duke and that we had access to funds. Third, we worked to fundraise between October 2018 and June 2019. We sought to raise funds so that we could offer pilot research funding to our participants, and we were able to secure support from Northwestern Kellogg, NICO, and the Community Data Science Collective. Fourth, we worked to recruit local speakers between December 2018 and May 2019. We sought to invite faculty and community leaders from throughout Chicagoland who engage in CSS-related work, and we were able to host seven local speakers—two of whom are SICSS alumni. Fifth, we worked to hire a TA between January and March 2019. This process involved creating and circulating a job description, selecting a TA, and then onboarding the TA. Lastly, we worked to coordinate rooms, A/V equipment, and catering between October 2018 and June 2019. This process involved setting up and negotiating contracts and fees with Northwestern Kellogg and working with their staff to ensure the rooms and catering were set up as requested.
What worked well was that we had prior experience coordinating these logistics and that we received significant help and support from our 2018 SICSS-Chicago co-organizers. Since this was our second year serving as a SICSS partner site, we had a lot of infrastructure already in place at Northwestern (e.g., a financial administrator, process for setting up chart strings, faculty and administrators who know what SICSS-Chicago is). The 2018 SICSS-Chicago co-organizers also provided significant help throughout this process. We would strongly encourage future organizers to (a) retain an organizer from year-to-year and/or (b) establish an “advisory board” of former organizers—we did both this year and we cannot overstate how tremendously helpful this was. Another thing that we tried this year that also worked well was that we hired a former SICSS-Chicago participant as our TA. Because our TA had participated in the summer institute, he required very little onboarding and was very proactive in anticipating and responding to things we needed help with as organizers. We encourage other organizers to seek out former SICSS participants as TAs for this reason. This year, we also sought to invite local speakers from outside of academia—namely, local nonprofits working on critical issues in Chicagoland. This seemed to work well and we encourage other organizers to try it out.
We believe that there is room for improvement in terms of the MOU and grant coordination process. It would be helpful if MOU and grant-related requests were relayed to organizers much earlier and in a more streamlined process. These requests often involve many people and require time, so earlier and more streamlined requests would help organizers to plan accordingly and accomplish these tasks in a more timely manner. Without overstating these issues, we will note that the money from the MOU arrived many weeks after the institute was completed. This did not prevent our institute, since we had strong prior connections with NICO, who agreed to float the cost until the money was processed. We would like to note that this is not an option for every partner site and that lack of timely fund transfer is extremely difficult for all-graduate-student teams to navigate absent these special circumstances. If there are procedures in place to help speed up the process with OSR, please advise organizers of these procedures, particularly if all organizers are graduate students or post docs who have not navigated this terrain before.
During the first week of 2019 SICSS-Chicago, our participants were engaged in the “training” part of the summer institute. Specifically, we met daily to watch lectures being livestreamed from the Princeton site, work on group exercises, and meet with local speakers. We made some modifications to the SICSS curriculum in order to accommodate the fact that we were operating an hour behind the Princeton site and to provide training that is relevant to our group of participants. For example, we shortened the daily group exercises by about an hour in order to fit in a local speaker. Based on feedback from SICSS-Chicago last year, we did not hold SICSS on the Saturday of the first week.
Instead, we provided our own curriculum on Friday in order to offer more training on experiments. And we conducted the “Data-Driven Research Collaboration” (a.k.a. “Speed Dating”) exercise on Friday so that participants could spend the day to plan out how they’d like to make use of their second week. Throughout the first week, we also asked participants to complete “More Of, Less Of” daily assessments so that we could receive real-time feedback. In addition, we had a few social events during this week so that participants could get to know each other.
What worked well was that there was a good balance between livestreamed and in-person activities and that participants were set up well to make full use of their second week on their projects. Participants were able to take part in everything that was happening at the Princeton site, but there were also local speakers and other in-person activities so that the majority of their time wasn’t spent just viewing livestreamed material. We thought that this was important in creating added value for our participants as a partner site, and we believe that it worked well. We also made sure to get our participants situated in group projects and to help them figure out collaboration-related logistics (e.g., places to work, how to reach organizers) before they set off to work during the second week. We believe this worked well and enabled the second week to run smoothly, particularly since we had a number of participants who were traveling from outside Chicago and even Illinois.
There is room for improvement in terms of preparing participants for the group projects. We initially did not plan to discuss the group projects until Friday of the first week, but we received so many questions about the group projects during the first two days that we decided to talk about them at the start of the third day. In the future, it would be best to start off the first day with a brief overview of the projects so that participants can begin thinking about this work and so that any anxiety about it is alleviated. Another consideration for moving forward is whether we as a partner site want to allocate funding specifically for social events and potentially for travel and housing costs. Our site does not fund social events or traveling and housing costs for our participants because we have mainly been focused on providing as much funding as possible to our participants for their collaborative research projects, under the assumption that our goal is to help early-stage CSS scholars in the Chicagoland area to collaborate. However, based on our interactions with participants this year, we recognize that the summer institute can also build strong ties among participants through informal interactions and that participants should not have to dig into their own pockets in order to take part in these events. We also recognize that participating in SICSS-Chicago may be cost-prohibitive to some who cannot coordinate their own travel and housing. We haven’t come up with a good solution for this yet but will continue to think about what our goals are and how we can best ensure inclusivity and access as we grow.
During the second week of 2019 SICSS-Chicago, our participants were engaged in the “collaborative research” part of the summer institute. Specifically, participants met with their groups and worked on their projects. We left it up to the participants to determine how they would work. Some groups worked together in-person at sites throughout the city, while other groups coordinated work remotely. The organizers and the TA made ourselves available to the participants though email, Slack, and in-person meetings. We had a few social events during this week as well so that participants could stay in touch with each other as they engaged in more independent work this week. The second week culminated with flash talks, group project presentations, and a farewell happy hour on Friday.
The second week seemed to run very smoothly. Each of the groups developed a workplan that worked best for them and seemed to adhere to it. Providing the groups with this flexibility to determine their own workplan ended up being particularly important because we had several participants who were commuting to the summer institute from out of state. We also provided a number of different ways for the participants to get ahold of us as organizers if they needed help (e.g., email, Slack, in-person meetings), and this seemed to work well. The social events during the week provided a good way for the participants and organizers to stay connected in an otherwise loosely structured week. We had excellent flash talks and group project presentations on the last day.
Since 2019 SICSS-Chicago officially ended, we’ve mainly been carrying out a grant application process for groups seeking pilot funding to continue their collaborative projects. To apply for a grant, we asked groups to submit a project description, a timeline, and a budget to us by July 12. We received grant proposals from four of the five groups. We have reviewed their applications and we hope to provide them with our decisions shortly, pending some accounting processes. Aside from this grant application process, we have been working to coordinate remaining payments. Here, we are also waiting on some accounting processes to be settled. Several of the participants are also working to organize regular happy hours for CSS scholars in the Chicagoland area. This work is still ongoing, though we hope to wrap up these post-summer institute tasks shortly.
Istanbul, Turkey (Kadir Has University) organized by Matti Nelimarkka (SICSS 2017) and Akin Unver
Applications and selection process
Having organized SICSS in Istanbul for the first time in 2019, we spent extra time on outreach and especially social media visibility. We began outreach in late January by announcing SICSS Istanbul through major social media outlets, and Kadir Has University also announced it through its own accounts. Although we expected the first-year applicants to be highly clustered in Turkey, or among Turks living abroad, we also wanted to attract as many applications as we could from Central and Eastern Europe, the Caucasus, the Middle East, North Africa, and Central Asia.
We set up a new Gmail account for applications and handled applicants’ materials primarily through that account. We did not encounter any major problems using Gmail as the primary medium of communication. We did run into problems with recommendation letters from applicants’ professors; they were either sent late or forgotten, so we had to evaluate applicants by their CVs, research proposals, and writing samples only. Next year, we are considering using Google Forms instead of email submissions to ease the administrative effort. This could also be adopted across the various SICSS sites, making it easier for applicants to indicate which SICSS site they’d like to participate in and to see the range of locations offering SICSS.
We received 61 applications and accepted 29 of them. For the first year, we sought to be less selective in the evaluation process in order to offer an opportunity to as many scholars as possible. We did this to (a) raise awareness of social data science in this part of the world and (b) test our university’s logistics and infrastructure capacity to host a large SICSS group. Next year, we’d like to be more selective, as we discovered that participants we admitted for opportunity’s sake ended up leaving the program or losing interest toward its later phases. The number of such participants wasn’t very high; we lost three of them toward the end of the first week. We also highlight the importance of communicating the expected commitment level more clearly and explicitly to participants, such as providing a schedule as early as possible and emphasizing that SICSS is an intensive two-week program.
- Pre-arrival and preparations
Similar to other SICSS sites, we used a simple pre-arrival process based on coding exercises and recommended reading. The global TAs were also available to participants, and this was communicated to them. Collecting information such as website bios and dietary preferences, booking tickets, and arranging housing went relatively smoothly.
However, we considered how to make the online component stronger and clearer for participants, both to demonstrate commitment and to help us establish a better common ground across participants. Could SICSS be an explicitly two-part institute, with a clear online curriculum in both social science and computer science in the first part, followed by two weeks of intensive, problem-oriented work in the second? We believe this could be implemented through a two-level admissions process: those who are pre-admitted must complete about 1.5 to 2 months of basic coding and fundamental reading preparation, and their progress must then be evaluated by the organizers and TAs before full acceptance to the program. Furthermore, this shift would allow us to spend less time on the basics and move on to more research-oriented topics in the first week. We plan to test such a process next year.
- Logistics and infrastructure
A major blessing this year was the full support of the university chancellor, who directed all campus resources toward SICSS. Thanks to this support, Kadir Has University contributed significantly by providing dormitory housing, free lunches, and free coffee and cookies, and agreed to re-allocate all SICSS funds provided by Duke University to participant travel reimbursement. We asked participants to purchase their own flight tickets, which we later reimbursed at an 80% cap, and the process of guiding them from the airport to the dormitories went relatively smoothly.
Hosting took place at Kadir Has University dorms; the female dorm was right next to the university, but the male dorms were farther away and required daily transportation. Although most people had no complaints about the arrangement, the male dorm was rather isolated from the main event, and its residents were dependent on a shuttle at all times. We will explore alternative options for next year, such as hosting all participants, or at least the male participants, in a closer hostel or hotel. Alternatively, we could ask all international or out-of-town applicants to arrange their own accommodation, but this year participants very much appreciated that accommodation was free and wanted this policy to continue.
Our main SICSS classroom this year was specifically prepared for the event, with additional power cables, dedicated wi-fi, and A/V equipment. Thanks to the university’s responsive IT desk, all of these key infrastructure components worked well, without interruption. We had a U-shaped seating arrangement, a dedicated projector and computer system, plenty of power sockets, whiteboards, natural light, the same room for the entire two weeks, and a microphone-speaker system. I (Akin) think that from most angles of logistics and infrastructure we were blessed, because the university supported us fully throughout the event and fixed problems very quickly.
- First week
This year’s attendees clustered around political science and political sociology, which is probably a reflection of the organizers’ main disciplines. Next year, we would like to work on diversifying participant backgrounds by directly reaching out to a wider disciplinary variety of scholars and professors to raise awareness in their respective fields.
Most participants expressed a preference for more time spent on applied programming and problem-solving, and less on live lectures and lecturing in general. While we understand this, their uneven programming skills and backgrounds required a sufficient amount of lecturing so that everyone was on the same page. This is why we would like to test a pre-admission phase next time: having every participant admitted in the first round complete a basic module on R functions, basic statistics, and packages, so that we can spend more time on hands-on research during the main SICSS. Participants also thought it would be a good idea to teach agent-based models, geospatial analysis, and network analysis.
Following good experiences from Helsinki 2018, we continued to deliver much of the content locally instead of live-streaming it from other sites. Although everyone was initially excited about joining other sites live to watch broadcast lectures, they eventually preferred spending more time on in-class activities. The main driver of this was that participants found live lectures too long, and suggested they would be much more interesting if kept to around 30 minutes of direct, method-related talks. Issues with broadcast audio, such as volume and stuttering/cracking, also sapped participant interest during the first week.
We decided not to include any local speakers at SICSS Istanbul, mainly because site participants were primarily interested in hearing the leading scholars of the field speaking at other SICSS sites. We are still deliberating whether to invite local speakers next year, as this year there was not much demand for it.
Finally, even clearer communication about how to find and use the materials would have helped a lot. Participants would have benefited during the second week from a clear idea of the structure of the various materials available. While we mostly used pre-made materials, developing materials that support the problem-solving process (e.g., reading StackOverflow or vignettes to find solutions to problems) could build more transferable skills for the second week and for lifelong learning.
- Second week
Following experiences from Helsinki 2018, we emphasized improving the facilitation of the group-work stages. We used two tactics: first, the groups wrote a brief research proposal on Monday, outlining the existing literature and formulating research questions. Second, every day we organized standing meetings where groups discussed their progress and highlighted any issues they had in executing the research process. These steps seemed to help shape the group work and make it more productive.
Participants gelled well this year and were able to form their research groups relatively quickly. There were two less cohesive research groups, but even they were able to finalize their research and present their findings at the end of the second week. However, these groups relied heavily on TA help, which at some point amounted to the TAs doing their work for them. One of our TAs complained that he was doing most of the research for a group whose participants had not completed the pre-arrival training and tried to copy-paste code without editing it for the purposes of their research. This suggests that the pre-arrival material and the one-week crash course were not sufficient to provide these skills to some of our participants, while others were able to take full advantage of them. We have elaborated ideas to address this in the paragraphs above.
We are interested in organizing ‘SICSS meetings’ in Istanbul to retain a robust network of emerging scholars in the countries around Turkey and to turn Istanbul into a regional social data science hub. We believe it is best to first gauge the interest of SICSS Istanbul graduates and to weigh logistical issues: most SICSS participants (even the Turkish ones) came from the US or Europe, so regular Istanbul meetings may not be logistically feasible. Conversely, events at ASA or APSA might not allow participants from the Middle East and Asia to easily engage with the global community. We believe this community must take shape organically, on its own, because at the end of the day those who remain engaged will be those who use social data science in their research over the long run.
- Organizing SICSS away from home
This year, Matti Nelimarkka from the University of Helsinki co-organized the event in Istanbul. It was his first time visiting Istanbul. Organizing an educational activity outside one’s home location can create additional issues, but there were surprisingly few: all of them could be solved in the few days before SICSS (such as checking and organizing the classrooms and discussing the learning goals and activities for each day together). I (Matti) believe this ease stemmed from Kadir Has University’s high commitment to organizing the event. For other summer institutes organized using the same approach, we maintained open communication, with a few teleconferences between me and the local organizer team before SICSS to help with the organization.
Los Angeles, CA (UCLA) organized by Alina Arseniev-Koehler (SICSS 2018), Pablo Geraldo, Bernard Koch, Friedolin Merhout (SICSS 2018), and Marcel Roman
This is the post-mortem for the SICSS 2019 partner site hosted by the California Center for Population Research (CCPR) at the University of California, Los Angeles. The institute took place from June 17 to June 28, concurrently with the Princeton institute, and used facilities at the Luskin School of Public Affairs. Topically, the Los Angeles institute added a focus on Machine Learning and Causal Inference to the broader computational social science schedule common to all SICSS partner sites. The following report provides a brief summary of the different stages of the institute, of what we, the organizers, perceive to have gone well, and where we experienced challenges.
Applications and Admissions
The SICSS-Los Angeles partner site faced a somewhat unique set of circumstances regarding timing, location, and funding that broadly influenced the application and admission processes. First, the partner site took shape after most other sites had already formed. We were therefore working with a shorter timeline, which led us to simplify the application procedures by substituting a one-page letter of interest for the writing sample and letter(s) of recommendation that other partner sites required. Second, as the only partner site on the West Coast, and given the Institute’s location within commuting distance of more than a handful of institutions of higher education, the partner site served a very large pool of potential participants. Finally, the partner site had a close association with CCPR and a unique, additional topical focus. This increased its appeal to a broad but specific audience affiliated with the center and/or interested in causal inference and machine learning.
In addition to these site-specific factors, we engaged in a very proactive recruitment strategy using the ASA methodology section listserv, the Google Computational Sociology group, and various social science departmental email lists at major universities across California. The partner site also benefited from a long list of distinguished guest speakers, which further increased interest among potential participants. All of these factors contributed to a very robust and diverse pool of applications, despite our inability to offer financial assistance for travel or housing like some of the other sites.
In our approach to admissions, we strove to be comprehensive and to assess all applicants holistically. Three of the organizers independently reviewed all the application materials for each of the 72 applications and then combined their assessments to make final admission decisions. In these independent reviews, we paid attention to how well the letter of interest articulated what applicants hoped to take from participation, whether applicants stood to benefit from participation (i.e., weren’t too advanced), and how they might contribute to the disciplinary and institutional diversity of the participant pool.
In all, we admitted 41 participants, which placed our partner site at the upper end of Institute sizes. To us, this number seemed to strike a reasonable balance between extending the opportunity to as many people as possible and avoiding group dynamics that might undermine the learning experience. Additionally, we reasoned that partner sites experience higher attrition, so starting with a large group was desirable to attain a good effective number of participants.
Overall, we did not experience major issues with the applications and admissions process, but we did find minor areas for improvement. First, the specific expectations for the letter of interest could have been stated more clearly. Since this was applicants’ only chance to present themselves besides their CV, it had a strong influence on admission decisions. The letters we received varied widely in detail and thoroughness, and with clearer expectations, decisions could have been more easily justified. Second, while the number of participants did not affect our budget, the admission process might be streamlined by setting a target number in advance. Finally, we used a dedicated email account for the institute, which we could have used more efficiently by clearly distributing monitoring responsibilities.
For our pre-arrival preparations, we relied heavily on the resources provided by the main site. We generally used the main site’s emails as templates and adjusted the language as appropriate for our situation. In our case, with one key organizer not being local, the work of figuring out the specifics of the partner site rested mainly with one other organizer. Luckily, we had great support from the CCPR administrators, who immensely reduced the logistical work for the organizers.
Overall, we aimed to hit the sweet spot between too many emails, too long emails, and too little information. In our case, this meant one major email to all accepted participants after admissions were completed and another reminder email the week before the institute. Similar to the main site, the former email contained primarily information about what we expected participants to do before the institute and what they could expect from the institute. The latter email reiterated logistical information to remind participants of all they needed to know to arrive prepared and on time for the first day of the institute.
Given our experience, there are at least three aspects we would try to improve in future iterations of the institute. First, participants did not appear to take advantage of the pre-institute learning opportunities, which led to some hiccups at the beginning of the institute. While the particular situation surrounding DataCamp might have contributed to this, we think it could be improved by reviewing and following up with those individuals who indicated little or no experience with R or another programming language in the weeks leading up to the institute (and similarly with those who indicated having no social science background). Second, we did not appear to succeed in converting participants to using Slack, which impacted communication before and during the institute. This might be improved by requiring participants to send us a GIF on Slack or by dedicating a short segment at the beginning of the institute to the platform. Finally, at any partner site that draws a largely commuting participant pool, parking will be an issue. This might be handled more elegantly by communicating the options and alternatives to driving more clearly up front, and potentially even querying transportation needs during the application process.
Since we realized that we would not be able to recreate the engrossing, holistic experience of the main site or some of the other partner sites, our aim with SICSS-Los Angeles was to provide a topically unique experience. To this end, we modified the schedule for the first week, keeping about 80% of the main site’s content and supplementing it with lectures, exercises, and guest speakers on Machine Learning and Causal Inference. This topical focus appeared to resonate with a wide swath of applicants and participants, but it also required us to make some tough decisions about cutting and/or condensing content from the well-thought-out general SICSS schedule. One blessing and curse our partner site had in making this condensed schedule work was the time difference, which allowed us to skip and/or localize Q&As that might be less engaging, make use of YouTube’s replay speed options, and generally skip over downtime in the stream.
Given that we were only meeting Monday through Friday, our schedule was already too tight to allow us to cover all materials from the main site. Once we did cut some content, we also freed ourselves mentally to rearrange some of the organizational elements in a way that seemed more beneficial to a partner site experience. Specifically, we moved the Research Speed Dating exercise to Friday morning which allowed participants to 1) have a known goal and purpose for returning for the second week and 2) use the weekend to arrange their group’s meeting schedule, since commuting to the Institute site was not practical for everyone.
Another partner site issue we aimed to address was the less engrossing experience of watching pre-recorded lectures compared to watching them in person or live. Our approach was to replace some of the recorded content with in-person lectures by the organizers. As hoped, this solution was more engaging for participants but, as the lectures from the main site are top tier, it also required a substantial amount of preparation to match the quality of instruction. By distributing the instruction load across multiple organizers, we made these time commitments feasible for all involved, but we also ended up with uneven quality of instruction. For future iterations, we would strive to agree on clear standards and ideally review the local content before the start of the institute.
Our second week was marked by a lot of flexibility, which benefited some of the groups that formed the Friday prior but also led to some unanticipated issues. We aimed to provide access to the same facilities used during the first week during business hours in the second week, while reducing demands on organizers by rotating so that one to two organizers were present at any time. This mostly worked out, despite some organizational hiccups with accessing the facilities. However, the discontinuity in which organizers were present was not optimal for advising research group projects, since no one organizer was aware of everything that had been going on. For future iterations, such continuity in the second week seems desirable.
In addition to moving the Research Speed Dating to the Friday of the first week, we had to move the research funding process to the second week for administrative purposes. This had the beneficial side effect of groups being able to draw on organizers to give feedback on the feasibility of the proposal. Unfortunately, it also distracted some participants from the research projects their group had decided to pursue and left others scratching their heads for what resources they might require or ask for. If the budget and timeline of future iterations allows it, it seems to be better to follow the funding timeline set by the main site.
Overall, the groups that completed projects during the second week were very successful, and we were impressed with the wide variety of projects that emerged from the institute, which drew on an even wider set of computational methods and data sources than we had managed to cover during the first week. An unfortunate development was that, as each day progressed, more and more participants left. This was partially due to outside commitments that could not be changed, but for some it also appeared to result from a lacking sense of Gemeinschaft. This is something we as organizers would hope to improve in future iterations by emulating some of the social events surrounding the Institute that the main site and partner sites outside the US have exemplified.
While many groups and participants have expressed interest in pursuing the projects started in week 2, this might be a challenge as everyone returns to their respective home institutions. One fortunate impulse we were able to provide to encourage the continuation of the projects was to invite a few groups to present at a CCPR event one month after the conclusion of the Institute. This is clearly not feasible for all partner sites. Where it is possible, however, inviting groups to present at a later date seems like a worthwhile endeavor to promote the growth of the networks and projects the Institutes help to start.
Overall, the Los Angeles partner site of the Summer Institute for Computational Social Science was a success in showcasing and bringing together the emerging CSS community in (Southern) California! We want to thank Chris Bail, Matt Salganik, and Jennie Brand for providing us and the participants with this unique opportunity, and CCPR, Ana Ramirez, and Lucy Shao for their generous financial and outstanding organizational support.
Monterrey, Mexico (Universidad Autónoma de Nuevo León) organized by Oscar Mendez (SICSS 2018) and Adaner Usmani (SICSS 2017)
We wrapped up the Summer Institute in Computational Social Science at the Universidad Autónoma de Nuevo León (Monterrey, Mexico) on Friday, June 28th, 2019. SICSS Monterrey is the first of its kind in Latin America. As such, while relevant for all future summer institutes, we expect this report to be most useful to future summer institutes in Latin America in general and in Mexico in particular. This post-mortem report summarizes what happened, what went well, and what could be improved for next time.
Oscar Mendez (Assistant Professor at UANL) and Adaner Usmani (Postdoctoral Student at Brown University) organized SICSS 2019 Monterrey. A total of 21 participants (16 graduate students, 1 postdoctoral student, 4 assistant professors) attended during the first week. Though our sessions started one hour earlier due to the time-zone difference, we managed to follow the morning lectures by Matt Salganik and Chris Bail in real time. We spent the afternoons working on the group activities, supervised by Oscar, Adaner, and Edgar Luna, our TA.
During the second week we asked participants to prepare an outline of the project they would like to work on along with an approximate budget. We did not ask for a final product by the end of the second week, mainly because most graduate students were in the middle of their school quarter (more about this below). Instead, we gave them two months to work on their proposed projects, with the promise that the funds to support them, as well as our technical support, would remain available throughout. Although the project part of the institute was voluntary, we were pleased to receive five team proposals prepared by 15 students.
What Went Well
Computational social science is a very new concept in Latin America. Much of what is being done in other countries, especially the U.S., is only starting to trickle into the region. Thus, our main motivation for this SICSS in Mexico was to introduce young scholars to these new tools and ideas and get them excited about the prospect of using them in projects and future research. In that sense, we consider that we successfully reached our goal. Students thoroughly enjoyed the experience of learning new methods during the first week—as evidenced by their enthusiasm during the activities and their overwhelmingly positive feedback on the institute—and they are already looking for ways to apply their newly acquired knowledge. Some have also shown interest in encouraging their home institutions to apply to become SICSS partner institutions.
We were positively surprised by the diversity of our attendees. Out of 21 participants, 10 were women. Along with our Mexican students (18), we received participants from Brazil (1), Argentina (1), and Venezuela (1). Participants represented a total of 7 universities and 5 disciplines (economics, engineering, sociology, technology, and business and accounting).
All technical aspects worked well during the first week. The streaming video and audio were great, and the resources we made available for participants to work on the afternoon exercises worked smoothly. As expected, only a few of our participants had any programming experience with R. To make things slightly easier for everyone, we organized a three-hour R mini-workshop to teach the basic commands. We think many participants do not spend enough time on the pre-arrival material, so a workshop like this, which we held the Sunday before the beginning of the institute, helps people familiarize themselves with the basic functioning of R so as not to feel too intimidated by the lectures and exercises. We also replicated what SICSS NYU did last year and placed at least one person with programming experience in each group.
The rest of the logistics—food, coffee breaks, space for working in groups, etc.—also went smoothly. The support of the university staff had a lot to do with that. We highly recommend getting university staff involved in the organization. We also recommend organizing an ‘inaugural dinner’ where participants can meet each other, the TA, and the organizers in a casual setting. Our dinner was held on Sunday, June 16th, immediately after the R workshop. This ensured high attendance.
Leading the activities in our native language (Spanish) made our participants feel more comfortable, while having them use English when presenting was a challenge that they ultimately enjoyed and deemed useful. Finding the right mix of languages is something we highly recommend thinking about. We were lucky that our co-organizer, Adaner Usmani, also speaks Spanish.
Initially we scheduled two local speakers, but in the end we only had time for one. While the local speaker’s presentation was very good and it sparked lively conversation and Q&A’s, we think the summer institute is too intense to have one talk every day on top of 6 or 7 hours of lectures and activities.
Since in Mexico the government requires a document more formal than a receipt (called a “factura”) for tax purposes, handling reimbursements and other expenses can get tricky. We recommend reaching out to Oscar (the UANL organizer) for details and tips on how to deal with these situations. Travel costs for the visiting external organizer were split between the university and SICSS funds. We did not have issues reimbursing expenses for traveling participants and the organizer.
We highly recommend having more than one organizer or TA, or both, supporting students during the activities. Two people do not seem like enough. Even with three, we still felt somewhat shorthanded at times, but it was manageable.
What Could be Better
Here we discuss specific things that could be improved for future summer institutes.
While pleased with the number and quality of our participants, we think outreach could be broader, reaching other areas of Mexico and other countries. For example, more emphasis could be placed on reaching potential participants in the Mexico City area and the southern states. Inviting participants from Central America would allow SICSS to reach places that otherwise have a hard time accessing these resources.
For SICSS Monterrey, the application packet consisted of a ‘letter of intent’ and a resume. As SICSS becomes more popular in the region, more care should be given to the quality of participants during the selection process. Adding recommendation letters would help with this, as would verifying that participants are on a semester system rather than quarters, since attending classes while the summer institute is running is not ideal and prevents them from working on projects during the second week.
At the UANL School of Economics, the Wi-Fi signal does not reach the classrooms and other multimedia rooms, so we had desktop computers ready for participants. This is not ideal, and for future institutes we recommend having areas where Wi-Fi is available. We managed by preparing Wi-Fi-equipped areas for group activities, but most groups decided to work directly on the desktop PCs. This ultimately was not a big issue, but it is an area that could be improved.
Before the summer institute started, one accepted participant from UANL informed us that he would not be able to attend. During the first week, another participant excused herself for health-related reasons, and two other participants missed two full days. During the second week, 6 participants decided not to prepare a project outline. While difficult, it would be a good idea to think about ways to make students commit to fully attending at least the first week of SICSS, and to encourage them to prepare projects for the second week. Making attendance and a project mandatory before selecting participants would probably fix a good proportion of these situations, but many students in Latin America find it difficult to secure funds to finance their whole stay, and SICSS funds are not enough to cover two-week stays for more than a handful of people. Hence, one should be careful when balancing mandatory requirements against the number and type of participants one expects to attract.
Oxford, United Kingdom (Oxford University) organized by Taylor Brown (SICSS 2017), Nicolo Cavalli (SICSS 2018), and Ridhi Kashyap (SICSS 2017)
1) Outreach and application process;
Application process: We decided not to require a letter of recommendation, to simplify the application process and make it easier for participants earlier in their academic programmes who may not have had a letter readily available. However, since we did not forbid them, we received some letters of recommendation anyway. This required us to establish a procedure to guarantee equal evaluation across participants. In future iterations, it may be better to specify clearly whether letters of recommendation will be assessed if submitted as part of an application, particularly when they are not required.
Initially we left around 4-5 weeks between our call and the application deadline. Toward the end of March, we extended the deadline from March 31st to April 15th, as we felt we had not left sufficient time to build a wide applicant pool. We suggest leaving at least 6 weeks between the release of the call and the application deadline.
Outreach: Once we prepared our call for the website, we did targeted outreach to several professional associations (e.g., the International Union for the Scientific Study of Population and the Economic and Social Research Council) and to departments within and outside Oxford, especially within the UK and Europe. In addition, the core organizers (Matt and Chris) announced our specific location on social media, which also helped our visibility.
Selection process: We received 182 applications (18 internal applications from Oxford) and accepted 24 (12 from Oxford). Our acceptance rate was 13%. Due to funding constraints and limited accommodation availability, we were only able to accommodate 12 participants who were external to Oxford. We were keen to have a mix of internal and external participants, and we believe this 1:1 ratio struck a very good balance.
We selected participants in the following way: first, each organiser went through all applications separately; second, organisers met to discuss each application and produce a shortlist; third, each organiser reviewed applications from shortlisted applicants; fourth, organisers met to discuss shortlisted applications and produce a final list of accepted applicants.
2) Pre-arrival and onboarding;
Prior to arrival, we requested that participants provide a picture and short bio for the website. This allowed participants to get to know one another beforehand. We also requested that they log in to Slack and familiarize themselves with the platform, as it was new to several participants. Finally, we asked about any dietary restrictions, so as to accommodate these needs and preferences during mealtimes.
On the night prior to the start of the institute, we had an informal welcome dinner and drinks at Nuffield College.
We offered accommodation in two Oxford graduate colleges (Nuffield and St Antony's) to 9 external participants, and subsidised accommodation for 3 external participants. Even though we had the monetary resources, it was difficult to find affordable housing in Oxford during term time. Because of our attempts to synchronize with the core Princeton site, along with other logistical constraints, we held SICSS June 16-28, which was still during the academic term in Oxford. To have more flexibility in finding housing for external participants, we may have to consider moving to a later date (e.g. July) for future iterations in Oxford.
Meals at Nuffield College were also provided. Although internal participants were not offered in-college accommodation, they were invited to meals in college. Shared meals fostered informal discussion between participants and helped with coordination and communication (e.g. changes to the schedule) throughout the two weeks. Several participants appreciated the high quality of the meals provided in the college, and the college was generally very good at catering to different dietary requirements and needs.
3) A/V and room requirements;
For A/V, we relied on the IT team of the Department of Sociology of the University of Oxford, which hosted the event. The department had recently moved to a new building, and this was the first time a large event such as this had been organized in its lecture room. The room was equipped with two large touch-screen monitors, which had internet connectivity and mirroring functionality and could also be used as whiteboards. Furthermore, the room was equipped with a camera and automatic video capture to enable recording and live-streaming through the software Panopto. We were thus able to record any session for which the speaker granted permission. The IT staff assisting us were incredibly attentive and helpful in troubleshooting problems, though the problems were few.
In the future, we would hope to use the same room, but we would provide large tables for work sessions. We found it very important to provide enough power outlets to charge laptops. This was generally achievable using extension cords, though it did require participants to cluster near the outlets. An unexpected benefit of our location was the proximity to fresh air, as the room was close to the outdoor patio of the department. There were also many windows in the room, which allowed for natural light and an orientation to the time of day. Two weeks of instruction and computer work can be taxing both physically and psychologically (especially for participants who have travelled from far away and are jetlagged); the ease of access to fresh air and sunlight during coffee and tea breaks was very important.
4) First week;
We started our opening day in Week 1 by asking participants to prepare lightning talks to introduce themselves and their interests to each other in the morning session. We also held a panel on "Tales from previous SICSS," in which Francesco Rampazzo (SICSS 2018 alumnus) delivered a presentation on his SICSS project.
In terms of instruction content in the first week, we broadcast live-streamed sessions from Princeton and, where local teaching expertise was available in a particular domain, held hands-on workshops and lectures conducted either by us (the organisers) or by other experts in Oxford. Having local lecturers was added value, as we found that it generally increased the engagement of our participants. We also found it important to guarantee a high-quality livestream from the main site: on the first day, when the live-stream quality was poor, engagement was low. In general, we believe that our approach of using the livestreams as supplements, rather than as the sole or primary content of the training, worked well.
In a slight departure from the main site programme for Week 1, we split the exercise on the day covering non-probability sampling and post-stratification into two parts, based on feedback from the previous year that the exercise on this day had been quite time-consuming and demanding. We covered MTurk on experiments day (Saturday) and provided participants with a pre-collected dataset on Friday (non-probability samples day). This worked very well.
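For readers unfamiliar with the technique, the core of post-stratification can be sketched in a few lines: cell-level sample means are reweighted by known population shares. This is only a minimal illustration, not the exercise's actual code; the demographic cells and all numbers below are invented.

```python
# Minimal sketch of post-stratification: reweight cell-level sample
# means by known population shares (e.g. from a census).
# All cell names and numbers are hypothetical.

# mean outcome within each demographic cell of the (non-probability) sample
sample_means = {"18-34": 0.62, "35-54": 0.48, "55+": 0.35}

# known population shares for the same cells
pop_shares = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}

def poststratified_estimate(cell_means, shares):
    """Population estimate = sum over cells of (cell mean * population share)."""
    assert abs(sum(shares.values()) - 1.0) < 1e-9  # shares must sum to 1
    return sum(cell_means[c] * shares[c] for c in cell_means)

estimate = poststratified_estimate(sample_means, pop_shares)
```

The point of splitting the exercise is visible here: data collection (building `sample_means` from MTurk responses) and the reweighting step are separable tasks.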
We organised an "Unconference" (unstructured conversations centred around organically emerging topics) to allow participants to start discussing research ideas for Week 2 during Week 1. We believe this was a successful addition to the first week that also helped participants get to know each other and their research interests.
We found that feedback forms were frequently not completed, despite our repeated reminders.
5) Second week;
In contrast to the main site, participants in Oxford had Monday to Thursday morning to work on group projects, with group presentations scheduled for Thursday afternoon. On Monday of Week 2, we performed minimum/maximum similarity clustering of participant interests, as per the main site's procedure. This exercise fostered an abundance of ideas, a subset of which eventually became the group projects. In the following days, the groups used five rooms at the Department of Sociology and in Nuffield College for their work sessions. We think it was a good idea to book separate rooms, as this gave groups a specific space to report to and work in over the week. We relied on the Center for Experimental Social Science (CESS) at Nuffield College to help us disburse funds to research groups that relied on online platforms such as MTurk and Prolific.
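The similarity-scoring step behind such clustering can be sketched roughly as follows. This is a hypothetical illustration using Jaccard similarity over keyword sets, not the main site's actual procedure; all names and keywords are invented.

```python
# Hypothetical sketch of scoring pairwise similarity of participant
# interests; the main site's actual clustering procedure may differ.
from itertools import combinations

# each participant's interests as a set of keywords (invented examples)
interests = {
    "alice": {"twitter", "networks", "music"},
    "bob":   {"reddit", "text", "demography"},
    "carol": {"twitter", "elections", "surveys"},
}

def jaccard(a, b):
    """Jaccard similarity: shared keywords / all keywords."""
    return len(a & b) / len(a | b)

# score every pair, then surface the most and least similar pairs
pairs = {(p, q): jaccard(interests[p], interests[q])
         for p, q in combinations(sorted(interests), 2)}
most_similar = max(pairs, key=pairs.get)   # candidates for one group
least_similar = min(pairs, key=pairs.get)  # maximally diverse pairing
```

Surfacing both the most similar pairs (natural collaborators) and the least similar ones (cross-disciplinary pairings) mirrors the minimum/maximum idea described above.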
Notably, the group composition for the research projects confirmed the potential of SICSS to bring together internal and external participants: all except one group had both internal (Oxford) and external (non-Oxford) members, and the one all-internal group brought together two scholars who had not previously worked together, and who said they would not otherwise have done so. Moreover, while we were concerned that local participants would not attend Week 2, the fact that external participants were contributing to the majority of projects may have partially sustained the engagement of internal participants, who continued to arrive early and work throughout the day on group projects. Attendance did not drop in Week 2. In addition to lunch breaks, afternoon tea and coffee breaks in Week 2 gave groups a chance to connect with each other. Furthermore, we also arranged research talks either before or after lunch on most days of Week 2, which helped provide structure to the day.
An additional activity that we organised during the second week (Monday) was a post-dinner discussion led by Daniela Duca from SAGE Publishing, London. Daniela illustrated a range of CSS activities promoted by SAGE Ocean. These include a set of scholarly-oriented funding opportunities, focused on supporting the development of new tools to help social scientists who work with big data.
On the last night of the institute and after the group presentations, we had a formal dinner at Nuffield college.
6) Research projects;
MTurk Experiment on Behavioral Change
Fijnanda van Klingeren (University of Oxford)
Giacomo Vagni (University of Oxford)
Experiment on how people’s opinions/behavior toward vegetarianism change due to different content on food packaging.
I, I follow, I follow you: cross-website networks of music tastes on Twitter
Adam Kenny (University of Oxford)
Vadim Voskresenskii (Free University of Berlin)
Examination of Twitter followers for the top 20 artists on three online music magazines (Billboard, Pitchfork, Quietus) to assess the connection between communities of listeners.
Election Prediction Post-stratification for Conservative Party Leadership Election (United Kingdom, 2019)
Florian Schaffner (University of Oxford)
Rishemjit Kaur (CSIR-Central Scientific Instruments Organisation)
Aidan Combs (Duke University)
MTurk survey and post-stratification to predict the results of the Conservative party’s leadership election in the United Kingdom in 2019.
Relationship Advice on Reddit
Julia Mikolai (University of St. Andrews)
Emmanuelle Afaribea Dankwa (University of Oxford)
Xuejie Ding (University of Oxford)
Judith Koops (Netherlands Interdisciplinary Demographic Institute in The Hague)
Examine relationship problems posted and responded to on Reddit, by age, gender, and sexual orientation.
The impact of AirBnb on Community Real Estate Prices
Clemens Jarnach (University of Oxford)
Tobias Rüttenauer (University of Kaiserslautern)
Paola (University of Oxford)
Timothy Monteath (London School of Economics)
Kayla Schulte (University of Oxford)
Samira Barzin (University of Groningen)
Examine the presence of AirBnb across London neighborhoods to understand the impact on real estate prices and other community-level variables.
Understanding inequalities in common knowledge as generated on Wikipedia
Arun Frey (University of Oxford)
Arran Davis (University of Oxford)
Pablo Beytía (Department of Social Sciences of Humboldt University of Berlin)
Jorge Cimentada (Pompeu Fabra University)
Mark Verhagen (University of Oxford)
Chris Barrie (University of Oxford)
Investigate the determinants of city Wikipedia articles and the extensiveness of their content.
We have been collecting travel reimbursement requests through our sicss-oxford email account. We have also been working with various finance teams in the department and in the college to organize and allocate payments through different grants. As a general issue, perhaps similarly to other sites, the process of planning SICSS has involved considerable administrative workload in terms of figuring out where best to access specific services, or how to access/disburse particular funds.
7) General notes
By accepting both Oxford and non-Oxford participants, it was our hope that SICSS would provide an opportunity to (1) consolidate an Oxford computational social science community, and (2) connect the Oxford community with scholars, resources, and networks in computational social science outside of the university. We believe that this occurred. As previously mentioned, the composition of the group projects demonstrated that both inter- and intra-university bonds were formed between participants. Moreover, the decision to house external students in-college, and to provide meals in-college for all participants, allowed for bonding that would have been unlikely if participants had been placed in separate housing throughout the city, or if only internal students had been invited to participate. Many of the conversations at mealtime and in after-hours socializing touched on topics that exposed participants to the strengths and resources of their different backgrounds. In addition to and in support of the professional opportunities fostered, the mix of internal and external participants also created a unique social life for the institute. Several Oxford participants organized events that would not have happened if only Oxford students, or only external students, had participated. These included tours of the university and colleges, game nights, picnics, and pub outings. These activities served as important, informal networking and bonding moments for the participants.
New York, NY (Hunter College-CUNY) organized by Maria Rodriguez (SICSS 2017), Gleneara Bates-Pappas (SICSS 2018), Sebastian Hoyos-Torres (SICSS 2019)
Budget & Funding
- Funding sources: We had funds left over from SICSS 2018, so we pooled those with this year's funds. In addition, we received $10k from the Dean of the Silberman School of Social Work to offset any additional costs. Lastly, the site host used some of their own research funds to cover a few reimbursements to students for travel.
- Future Funding Considerations: For this year's SICSS we strove to make it as accessible as possible to everyone we thought should attend. We paid for 2 child care RAs, provided CUNY dorm housing, and reimbursed Airbnbs for folx who needed housing (the Airbnbs were cheaper). These accessibility efforts accounted for approximately 45% of our overall expenditures. This makes it unlikely that SICSS 2020 could occur at Silberman without 1) more funding or 2) a decrease in the number of students we admit.
The recruitment process for speakers went quite well. The Hunter SICSS team created a spreadsheet with a list of possible speakers. The search was focused on social work scholars and advanced practitioners who are currently doing computational research of some kind, whether using VR for training students, working with social media data, or using R or a similar coding language for their research. After identifying ten possible speakers, each was sent a personalized email asking them to speak at Hunter SICSS. Five of the speakers responded very quickly and were enthusiastic about speaking. One of the invited speakers had several conversations with the RA, but it was decided that this speaker's current research was not a good fit for SICSS-Hunter this year. Four of the invited speakers did not respond to the invitation; a member of the team followed up with them via email twice and did not receive a response. We recorded each lecture, but are still awaiting video processing from the graphics department before we can make them available. All speakers signed consent to have the video shared publicly, and most provided their slides for sharing as well. We also reimbursed speakers for travel and lodging, as appropriate.
Recruitment: We received a total of 36 applications and accepted 26 of those. Our selection criteria focused on three key things: 1) the applicant was a doctoral student or applying to doctoral programs, 2) the applicant was untenured faculty at the time of application, 3) the applicant committed to attending all sessions of SICSS. Our attendees came from 8 institutions and represented health and social science fields of study. Importantly, several of our applicants were from public universities. The Hunter SICSS team is curious about the institutional breakdown of other sites and the propensity for applicants from public institutions to be accepted to them. Our hypothesis is that partner sites hosted at public institutions have greater diversity of institutional affiliation.
Attrition: All of our attrition was due to circumstances beyond the control of the participants and host site, which we take as a good indication that the care we took to build an accessible SICSS site paid off.
Three applicants who were selected and confirmed attendance later declined to attend, for the reasons listed below:
- 1 due to a family member’s unexpected passing
- 2 due to attendance needs that we could not meet (both required visa support and full reimbursement of international travel)
We lost an additional 2 students after Hunter SICSS was underway: both during the first week. Reasons given are noted below:
- 1 due to personal illness
- 1 due to family member’s unexpected passing
Childcare: We provided childcare for 1 student who needed to bring their children (ages 6 and 10) with them. There were varying opinions about whether we should have done so, but we saw it as a way to do a few things: 1) provide short-term employment for 2 MSW students and 2) provide a worry-free learning space for a parent. We kept a 1:1 ratio of child to care provider, ensured care providers had up-to-date NY state child abuse clearance and mandated reporter training certificates, and had the care providers create lesson plans for each day, some of which are included below as exemplars.
Housing: We had 2 students request accommodations at CUNY dorms, while 6 others were provided reimbursements for securing their own housing options. We believe this made a trip to NYC both financially feasible and less stressful than otherwise. We also feel this allowed for more participants to travel from institutions that were further away, and did not have a closer partner site option.
Each day we asked students to complete the standard SICSS daily evaluation: a short-answer Google form with three sections titled 'Keep', 'Start', and 'Stop'. All responses for each day are included below.
Overall, participants were positive about the daily schedule, food, group work times, and TAs. Participants suggested having more TAs in the future, but most of the issues were present only on the first day.
Things we Can Improve
- Live Demos: Participant evaluations were generally positive about all the live demos. All but one was done in a live-coding style, similar to The Carpentries approach (although without the pedagogical rigor). One way to improve outcomes at a future SICSS is to tailor the demos to methods connected with the video lectures. Perhaps this would also motivate some participants to stay off their computers during the video lectures; laptop use during lectures struck us as a bit excessive, though it did not seem to affect the quality of the final presentations.
- Housing: The reimbursements for housing became pricey quickly, but it’s unclear if removing them would be detrimental to the accessibility of the institute. If main site organizers have recommendations for other streams of funding, we would love to know about it, as we think housing in NYC is key to making SICSS accessible to as many people as possible. The Silberman administration is committed to making SICSS happen again in the future, and is open to exploring institutional streams of additional funding as well.
- Slack: While we encouraged participants to join Slack as early as May 2019 (one month before Hunter SICSS), and multiple times thereafter, participants were slow to adopt it. Once they became more familiar with it, we saw an increase in use, but many participants still did not take full advantage of Slack, particularly #officehours. Further, there has been no post-Hunter-SICSS activity on Slack, which may indicate a need to explore other platforms.
Do we do another one?
SICSS was worthwhile and would be valuable to do again, with some caveats in mind. First, if we were to do it next summer, we should ask for a letter of recommendation from participants. As we discussed, requiring a letter would raise the bar by asking potential participants to make a time investment, without being cost-prohibitive for those who aren't from high-income backgrounds.
Some students came to SICSS with little to no background in R, which made it difficult for some to keep up with the lectures. If we were to conduct another SICSS, we should provide a brief session that ensures the prerequisites are met, so that all students can keep up with the video lectures.
It's important to note that more than half of the participants identified as people of color. One participant shared that they would not have been able to receive this training anywhere else because of the cost, and that the only reason they applied to SICSS was because it was free. They went on to share that they were grateful that nutritious food was provided: they have no summer funding and currently have $500.00 in their bank account, and knowing they would receive 3 meals a day for two weeks allowed them to focus on learning rather than worry about how they were going to eat.
Host Faculty thoughts
The budget for this site shows that funds used to increase accessibility and minimize stress can serve to provide a superb learning environment for SICSS partner sites. Having basic needs taken care of allowed students to produce some fascinating, innovative work that we hope is carried forward. One area that needs immediate attention is how to create a forum for SICSS graduates to remain in conversation. The Facebook group has served well for posting open positions and the like, but I wonder if there is a way to create an annual conference at which folx might present continued exploration of their SICSS projects, or new ideas they began while at SICSS. Especially for those who do not attend ASA or APSA, this might prove valuable in strengthening the academic community, and it may lead to other venues that provide greater exposure for CSS, such as open-source journals.
Sample Child Care Lesson Plans
Monday, June 17th, 2019:
9:00-10:00 First day we were getting comfortable with the kids and did some origami and made some fortune tellers as per the children’s request and paper boats.
10:00-11:00 We did some coloring and crosswords puzzles.
11:00-12:30 Went to the patio in school, used chalk to do mathematical questions and played some games which included hide and seek, telephone and hop scotch.
1:30-3:00 Computer time, which included nitro typing and other educational games.
3:00-4:15 Nap time.
4:15-5:30 Creative time.
5:30-6:00 Light dinner.
Tuesday, June 18th, 2019
9:00-10:30 Journal time. Writing prompt for both: write about the previous day and what you enjoyed doing. What else would you like to do? What are some things that you noticed in New York City?
10:30-11:30 Book reading time.
11:30-12:00 Book summaries and questions.
12:00-12:30 Some coloring and fortune teller games.
1:30-3:00 Computer time including nitro typing and other games.
Wednesday June 19th, 2019
9:00-10:30 Journal Time. Writing prompt for Tish: if you could invent one thing, what would it be and why? For Shri: write about an important person in your life and why.
10:30-11:30 Reading books.
11:30-12:00 Book summaries and questions.
12:00-12:30 Coloring and creative time. Shiri drew pictures of herself and Tish colored a drawing.
12:30-1:30 Lunch time.
1:30-2:30 Mom suggested 40 minutes coding for both children and 20 minutes nitro typing.
2:30-4:00 Park (went on the swings, slides, played with bubbles).
4:00-4:30 Returned to Silberman, followed by early dismissal.
Thursday June 20th, 2019
9:00-10:30 Journaling Time. Writing prompt for Tish if you could have any superpower what would it be and why? For Shri it was write a letter to your mom and dad.
10:30-11:30 Reading books.
11:30-12:00 Book summaries and questions.
12:00-12:30 Coloring and creative time. Both children colored pictures.
12:30-1:30 Lunch time.
1:30-3:30 Picked out new books from the library and completed independent reading. Also, received free books from the librarian to take home.
3:30-4:45 Scientology exhibit museum.
4:45-5:00 Traveled back to Silberman.
5:00-5:30 Light dinner and dismissal.
Friday June 21st, 2019
9:00-10:30 Journaling Time.
10:30-11:30 Reading books.
11:30-12:00 Book summaries and questions.
12:00-12:30 Coloring and creative time. Both children colored some drawings.
12:30-1:00 Lunch time.
1:30-3:30 Park (went on the slides and swings, played with bubbles, held a spelling bee contest with other children in the park, and played frisbee).
3:30-4:00 Returned to Silberman and had a light dinner.
Monday June 24th, 2019
9:00-11:00 Journaling Time. Writing prompt for Tish, if you could go back to any time in the past, which time would you return to and why? Writing prompt for Shiri, what is your favorite season and why?
11:00-12:00 Independent book reading.
1:15-2:00 Book summaries and questions.
2:00-3:00 Computer time (40 minutes coding and 20 minutes nitro typing).
4:30-5:30 Nap time.
5:30-6:00 Light dinner.
Research Triangle Park, NC (RTI International) organized by Antje Kirchner (SICSS 2017), Craig A. Hill, Alan Blatecky, Helen Jang, Jacqueline Olich, and Chloe Stephenson
RTI International (henceforth referred to as RTI) is an independent, nonprofit institute that provides research, development, and technical services to academic, governmental and commercial clients worldwide. RTI was the first non-university partner site for SICSS. As a result, RTI did not receive any direct support from the Alfred P. Sloan Foundation or the Russell Sage Foundation but financially supported SICSS-RTI site out of its own resources. RTI hosted 31 total SICSS participants, 8 of whom were RTI employees.
The SICSS-RTI post-mortem summarizes the event, which was held at RTI headquarters in Durham, NC, from June 17-28, 2019. The report is divided into five main sections: 1) outreach and application process; 2) pre-arrival; 3) first week classes; 4) second week group projects; and 5) post-departure.
Outreach and application process
Given RTI’s interest and research in computational social science, our outreach process was developed to attract both internal and external applicants, specifically recruiting a diverse group of mostly North Carolina participants since we were unable to provide any housing or travel assistance. In total, we received 36 completed external applications and 10 completed internal applications.
A Selection Committee evaluated applications based on the quality of the application materials, which included a letter of recommendation, a motivational statement detailing how the applicant would benefit from SICSS, and a Curriculum Vitae (CV). Each applicant was evaluated by at least two members of the Selection Committee who gave the application a ranking of low, medium, and high according to predetermined criteria. When two members of the selection committee ranked an applicant differently, the entire selection committee discussed the application in detail and settled on an appropriate ranking. Overall, we were pleasantly surprised by the number of highly ranked applications we received and, ultimately, we were able to accept 31 total participants to SICSS-RTI.
RTI announced participation in SICSS as a partner site in February through RTI.org. We leveraged RTI’s University Collaboration Office (UCO), led by Jacqueline Olich, to inform university contacts in North Carolina as well as other networks (e.g., at the NC chapter of ASA and NISS). After receiving only a few applicants by early April, we more specifically targeted the types of applicants we were looking for and more actively promoted SICSS at RTI. Specifically, under the “who should apply” description—graduate students, postdoctoral researchers, untenured faculty within 7 years of their Ph.D—we explicitly added “and researchers with similar qualifications” to effectively broaden the applicant pool. Specifically, we wanted to reach individuals at research institutions, like RTI, in addition to those in academia.
To more actively promote SICSS, we ran a paid LinkedIn advertisement in April 2019. We also sent reminder emails to our contacts at universities in North Carolina, pushed out the information via RTI’s Twitter feed, and leveraged the networks of RTI employees to spread the word. In retrospect, we would have benefitted from more actively promoting SICSS earlier on in the process, particularly given that we were a first-time partner site.
The application process was conducted by email (firstname.lastname@example.org) and the original application deadline was April 12, 2019. Still concerned with the number of applications we had received one week before the deadline, we extended the application deadline to Sunday April 21, 2019. We received the bulk of the applications over the weekend of April 21st and closed the application window with 36 total external applicants. It is hard to say whether extending the deadline actually resulted in significantly more applicants or whether applicants would have submitted applications closer to the deadline irrespective of the exact date. In retrospect, we should have been more sensitive to the fact that April 21 was a religious holiday for some applicants.
RTI employed a nomination process for internal applications. Division vice presidents were informed of SICSS in early March 2019 and were asked to nominate employees who would both benefit and contribute to the larger cohort by April 12, 2019. Ten RTI employees were nominated in total from our social sciences and international development divisions.
RTI sent acceptance and decline emails in early May 2019 and a Welcome Package to accepted participants in mid-May 2019. The Welcome Package included a request for a short bio and headshot (both for the website), day one arrival information, pre-arrival coding practice details, pre-arrival readings, and office hours information. Furthermore, since RTI is a government contractor, foreign nationals had to go through a background check prior to being allowed on campus. RTI did not offer accommodations to participants but did provide meals. Participants were expected to find their own lodging (if needed) and transportation to RTI for the event.
When debriefing after the event, we, the organizers, discussed several things we would try to communicate more clearly if we were to host SICSS at RTI again. First, we would make it even more explicit that participants are expected to do the readings, coding exercises, and other pre-arrival materials before SICSS. Further, we would specifically emphasize that those with less coding experience should complete all coding practices in order to be prepared to fully engage on the first day. Second, we would more strongly emphasize during the event that experienced coders should let the less experienced coders do the work during group activities, and make it clear that, in order to be successful, participants could finish group exercises in the evenings (assuming they did not finish during the allotted time). Third, not all participants were willing to share the code they developed during the exercises with other participants, even though we made it clear from the start that everything should be shared; accordingly, we would make that expectation even clearer from the start. Lastly, in the future, we would like to better manage expectations about what the Summer Institute can and cannot provide. For example, we would make it clearer that a sociologist will not walk out at the completion of SICSS as a fully trained data scientist, or vice versa, and that the readings are vital to getting the most out of SICSS.
Week 1: Classes
SICSS-RTI followed the same pattern as the Princeton site during the first week: lectures in the morning, lunch, group activity, feedback, and then listening to a visiting speaker either in person or through the live stream. On the first day of SICSS, we started at 8:30 am to allow time for registration and introductions before the first streamed lecture at 9:30 am. Wayne Holden, RTI’s President and CEO, welcomed the participants in person, along with Karen Davis, the head of RTI’s Research Computing Division. To help build community, we allocated time for the participants to individually introduce themselves. On all other days during the first week, we started at 9:00 am, discussed logistics for the day, and then tuned into the live stream by 9:15 am. One lesson we learned during the first couple of days of SICSS was to always keep the YouTube channel open on the screen so that we did not miss the start of the live stream. On the few occasions that we missed the start by a few minutes, we started from the beginning of the recording and informed our participants that we had a few minutes delay.
Unfortunately, on the first day of the event, Princeton experienced technical difficulties with the live stream which made it hard to see the slides from the video stream. We were able to quickly adjust and display the slides locally on one of our two projectors. We also encouraged the participants to access the slides themselves during the lecture.
We received some feedback from participants that they needed more group activities, and more time for them, in the first couple of days, so that participants could dive deeper into the topics covered, get to know each other better, and cultivate trust. Other university partner locations held events the Sunday night before because all participants were staying on campus as a group; given that RTI did not provide accommodations, our event started Monday morning, and our social event did not happen until Wednesday evening, after our local speaker Sam Adams gave the talk Knowledge Graphs for Social Science. In the future, we would consider organizing a lunch-time activity and/or a welcome reception (especially on the first day) to allow participants to get to know one another, rather than listening to the Monday lunch-time visiting speaker.
Overall, we found Slack to be a successful communications tool that allowed us to communicate with participants during lectures without interrupting them. It also worked well for speaker questions: many of our RTI participants asked questions via the Slack questions channel, which the lecturers answered. In the future, we would use Slack to encourage participants to revisit materials such as the code and notes in the evenings and to prepare for the next day.
For the group activities in the afternoons, we randomly assigned individuals to groups each day and randomly assigned the order in which groups would present and discuss their results at the end of the exercise. Some participants said they would prefer to work with peers interested in similar research topics, but we explained that week two would provide this opportunity and asked them to embrace the diversity and focus on learning from one another. Due to the breadth of the exercises, combined with lectures sometimes running over, most groups had difficulty finishing the exercises in the allotted time and often presented only partial results. Although we clearly communicated that this was both reasonable and anticipated, it remained a constant source of frustration, exacerbated by the fact that some participants had not completed the pre-arrival coding exercises.
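The daily random grouping and presentation ordering described above can be sketched in a few lines. This is a hypothetical helper for illustration, not code actually used at SICSS-RTI; the function name and parameters are our own.

```python
import random

def assign_groups(participants, group_size, seed=None):
    """Randomly partition participants into groups of roughly equal size,
    then return the groups in a random presentation order."""
    rng = random.Random(seed)  # seedable for reproducibility
    shuffled = participants[:]
    rng.shuffle(shuffled)
    # Split the shuffled list into consecutive chunks of `group_size`;
    # the last group may be smaller if the sizes do not divide evenly.
    groups = [shuffled[i:i + group_size]
              for i in range(0, len(shuffled), group_size)]
    rng.shuffle(groups)  # random order for end-of-day presentations
    return groups
```

Re-running this each afternoon with a fresh seed gives new random groups every day, which matches the mixing we were aiming for.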
Lastly, RTI recommends that all sites provide more information about guest speakers well in advance. With multiple speaker options to choose from during lunch and in the afternoons, it would be helpful to have the title and a short description of each talk in advance, so that participants can choose which talk interests them most, or whether they would rather attend the lunch-time social activity instead.
Week 2: Group Projects
During the second week at SICSS, participants worked on group research projects. SICSS-RTI did not convene on Saturday for morning lectures like the other sites, so on the Monday morning of the second week we watched the recorded lectures from Saturday. On Monday afternoon, we followed Princeton's cluster process for group formation. The process worked well, especially since we had encouraged individuals to think about potential projects during the first week, and it resulted in six groups, with one larger group of eight people and one smaller group of only two. For future events, however, we would encourage participants to think more critically about how they choose their group, and ask those proposing topics for a group project to explain their thinking in more detail. To encourage this, we might lengthen the discussion around possible projects or add a break during the process to allow more thinking and side-discussion.
We had set aside dedicated funds, similar to the other partner sites, for participants who decided to work on their own projects and needed to collect their own data. These funds were made accessible through an RTI credit card (for which we made sure to have a sufficient credit limit). Additionally, RTI was able to provide SICSS-RTI participants with access to two unique datasets. The first included drone imagery that could be used to explore food security in Rwanda. The second was synthetic (or artificial) population data for North Carolina. On Friday afternoon of the first week, RTI provided two workshops to familiarize participants with different machine learning techniques using both RTI-provided datasets. Two of the groups chose the synthetic population dataset for their group project. None of the groups used the drone dataset, which we believe was due to the advanced coding skills required to work with it. However, we could have suggested some project ideas to encourage engagement with the data. We did receive feedback that few participants felt capable of working with drone image data, but that they found it valuable to have a workshop on image data (an introduction to neural networks), which was something unique that RTI provided. Because these datasets were hosted in an Azure environment with access restricted to RTI and RTI SICSS participants, we were unable to share them with other SICSS partner locations, as originally planned.
On the last day of the event, participants presented their group research projects. We opened the group presentations to all of RTI and streamed them for anyone who was interested.
SICSS at RTI was a resounding success. We were able to stay within our budget and our participants found it a valuable experience. For a more detailed, personal account of SICSS, see the following blog by one of our SICSS participants: https://www.rti.org/insights/using-data-science-solve-social-science-dilemmas.
Zürich, Switzerland (ETH Zürich) organized by Elliott Ash (SICSS 2017) and Elena Labzina (SICSS 2018)
Elliott and Elena organized the summer school at the Department of Social Sciences of ETH Zürich. We had to “compete” with another summer school that the Computational Social Science Group organized around the same time. In the end, it was not a problem because we had more external applicants, while they had more internal ones.
Most of our 19 participants were Ph.D. students, but we also had several master's students, a few postdocs, and one professor. In the end, this diversity worked out great. Geographically, most of our participants were from Europe, but we had one person from India and several people from the USA.
The onsite events of our school consisted of the invited talks, a brief two-day course on Machine Learning, and research project-related activities.
What Went Well
From the academic perspective, everything went well. The participants enjoyed the talks and were active during their group projects.
Regarding the project work, we want to highlight two innovations that worked great:
- Participants gave their flash talks during the first two days of the school, and we asked them to form research groups by the end of the first week. This gave them more time to work on their projects and reduced uncertainty, since they learned about the other participants from the very beginning of the school.
- We gave a best-project award (selected by Elliott and Elena) and a popular-choice award (voted on by the participants). A bit of competition created extra incentive to improve the projects. In the end, all the projects were great, and some groups produced actual first paper drafts.
We also offered a two-day Machine Learning course. Its goal was to familiarize participants with the basic concepts of modern computational statistics. Although we did not have much time, the participants paid close attention to the material and asked a lot of questions; they seemed to enjoy this brief course.
What Could be Better
Given our budget, we could provide accommodation to only 10 of our 20 participants, and even then we had only enough money to book a basic hostel nearby. This is a downside of hosting the event in Zürich, which is an expensive city to visit.
Room and the AC
We were lucky to be able to book a room for the whole period of the summer school: a seminar room in our department that usually accommodates 15-20 people comfortably. Unfortunately, the AC did not work well in that room, and during long 3+ hour summer sessions with 20 people, it felt crowded and stuffy. As a lesson, we recommend booking a slightly larger room than needed and ensuring that a comfortable temperature can be maintained.
One of the biggest problems we encountered was the quality of the meals from our catering provider. This was especially true for the vegetarian meals, which on some days were clearly unsatisfactory. In the end, we managed to sort it out, and the meals were excellent during the last days of our school. As a lesson, it makes sense to pay close attention to the menu proposed by the catering company, to make sure the quality of the meals does not drop at some point.
While we got a lot of great applicants from inside and even outside Europe, we felt that the number of applicants from the Zurich area could have been larger. We should have advertised better at our own university (ETH) and at the other large university nearby (UZH). We did get some applicants from ETH and UZH, but we should have placed more visible advertisements (such as printed information sheets) around our campus.
Conflicts with applications to other SICSS sites
We had two types of problems in this regard:
- We had an applicant who also applied to another location; when we picked him, he first accepted our offer and then turned it down because he decided to go to the other site.
- We had several applicants who applied to us after the deadline, apparently because they had just found out that they had been turned down at the main site. We accepted some of those applicants because they were a good fit.
To reduce the duplicated effort these conflicts create, one option would be a centralized application system with two-sided ranked matching of candidates and sites; another would be to ask people to apply to a single site.
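The two-sided ranked matching idea could be implemented with applicant-proposing deferred acceptance (the Gale-Shapley algorithm, in its capacitated "hospitals-residents" form). The sketch below is purely illustrative, with hypothetical names and the simplifying assumption that every site ranks every applicant:

```python
def deferred_acceptance(applicant_prefs, site_prefs, capacities):
    """Applicant-proposing deferred acceptance for matching applicants
    to sites with limited capacity. Assumes each site ranks all applicants."""
    # rank[site][applicant] = position of applicant in that site's ranking
    rank = {s: {a: i for i, a in enumerate(prefs)}
            for s, prefs in site_prefs.items()}
    matched = {s: [] for s in site_prefs}          # site -> tentative accepts
    next_choice = {a: 0 for a in applicant_prefs}  # next site index to try
    free = list(applicant_prefs)                   # currently unmatched

    while free:
        a = free.pop()
        prefs = applicant_prefs[a]
        if next_choice[a] >= len(prefs):
            continue  # applicant exhausted their list; stays unmatched
        s = prefs[next_choice[a]]
        next_choice[a] += 1
        matched[s].append(a)
        if len(matched[s]) > capacities[s]:
            # Site over capacity: bump its least-preferred tentative match.
            matched[s].sort(key=lambda x: rank[s][x])
            free.append(matched[s].pop())
    return matched
```

The applicant-proposing variant yields a stable matching that is optimal for applicants, which seems like the right default for an admissions setting; a real system would also need to handle sites that do not rank every applicant.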
We hope that this blog post described a) what we did, b) what we think worked well, and c) what we will do differently next time. We also hope that this document will be useful to other people organizing similar Summer Institutes, as well as people who are organizing partner locations for the 2020 Summer Institutes in Computational Social Science. Finally, we wish to thank the main funders of the Summer Institutes in Computational Social Science: the Russell Sage Foundation and the Alfred P. Sloan Foundation.
If you are interested in hosting a partner location of SICSS 2020 at your university, company, NGO, or governmental organization, please read our information for potential partner locations.