HIV/AIDS research collaborations, visualized

Co-authorship networks give us a sense of the strength of research collaborations. We used co-authorship data to visualize how top HIV/AIDS research institutions worked with one another, based on publications from June 2012 to September 2013. UCSF collaborations are indicated via red lines.

Visualization details: Data includes all known publications related to HIV/AIDS between June 2012 and September 2013 that include co-authors from two or more institutions. We map each author to their institution, and the size of each institution corresponds to the number of HIV/AIDS publications its members co-authored in that time; only the most prolific institutions are shown to keep the image readable. The width of the lines connecting institutions corresponds to the number of publications that include co-authors from both institutions. Collaborations with UCSF researchers are indicated with red lines. Colors indicate clusters of institutions that often publish collaboratively, based on network modularity.
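For readers curious about the mechanics, the node sizes and edge widths described above boil down to simple counting over publication records. Here is a minimal sketch in plain Python; the institution names and publication records are invented for illustration, not drawn from our data:

```python
from collections import Counter
from itertools import combinations

# Each publication is represented as the set of institutions of its
# co-authors. These records are hypothetical examples.
publications = [
    {"UCSF", "Harvard", "Johns Hopkins"},
    {"UCSF", "Harvard"},
    {"Johns Hopkins", "Emory"},
    {"UCSF", "Emory"},
]

node_size = Counter()   # collaborative publications per institution
edge_width = Counter()  # co-authored publications per institution pair

for pub in publications:
    if len(pub) < 2:
        continue  # single-institution papers are not collaborations
    for inst in pub:
        node_size[inst] += 1
    # every pair of institutions on the paper shares one collaboration
    for pair in combinations(sorted(pub), 2):
        edge_width[pair] += 1

print(node_size["UCSF"])                # 3
print(edge_width[("Harvard", "UCSF")])  # 2
```

A real pipeline would also feed these counts into a modularity-based community detection step to produce the color clusters, which is beyond this sketch.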


HIV/AIDS Collaborations, June 2012 - Sep 2013

UCSF collaborations, visualized

UCSF researchers often work closely with one another, across departments. We used data from UCSF Profiles to visualize how different departments work together, based on co-authorship patterns.

Visualization details: Data is drawn from UCSF Profiles, and includes all publications listed on PubMed that are co-authored by current UCSF researchers from two or more departments. The size of each department corresponds to the number of publications its members have published in partnership with other departments. The width of the lines connecting departments corresponds to the number of publications co-authored between the two departments. Colors indicate clusters of departments that often publish collaboratively, based on network modularity. No scaling is done to account for the varying sizes of departments.


UCSF internal collaborations, by department, based on publication co-authorship

Collaboration Success Wizard — want to see whether your geographically distributed team is poised for success?

Collaboration Success Wizard.

Check this out … looks really interesting!

Here’s the description from the site:

Once a project is approved to participate [to use the Wizard], we send invitation e-mails to all the project members. The Wizard is an online survey that takes about 30 minutes. Each individual involved in the project should take the survey independently. The more project members who take the survey, the better the data!

And yes – it’s free!

At the end of the survey each participant can see a personalized individual report that contains feedback based on their answers and our research. This report is available immediately, and summarizes both the strong points and the issues at risk for the target collaboration.

The “first follower” is as important as the leader

I was forwarded a great video from Opinder Bawa (UCSF's CTO) today — here's the video and Opinder's lead-in.

Kevin Grumbach mentioned in conversation the wonderful five-minute video from a TED talk about how being a "first follower" is as important as being a leader.

Do check out the link below; I think you will find it as relevant as I did for what we are trying to do with team research, community engagement, health professions education, health care teamwork, and the like.

Notes from the Science of Team Science Conference at Northwestern University

The event was packed with theories about the drivers and challenges of team science, along with some interesting initiatives and tools. One of the highlights for us was certainly the introduction to UCINET, a social network analysis tool for team science, which might be useful for further impact analysis of ShareCenter and crowdsourcing tools like our Open Forum. John Skvoretz from the University of South Florida walked us through the basic methods of social network analysis for team science. Using this program, we could get better insight into whether these tools help researchers from different disciplines connect, and whether most users make new connections or connect with people they already know offline, which revives the question of whether distance is dead. The program can handle a maximum of 32,767 nodes and includes centrality measures, subgroup identification, role analysis, and more. Now the challenge is to get the data!
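UCINET itself is a standalone package, but two of the basic measures Skvoretz covered — degree centrality and subgroup identification — can be sketched in plain Python on a toy collaboration graph. The researcher names below are hypothetical:

```python
from collections import deque

# Toy undirected collaboration graph: researcher -> set of co-authors.
graph = {
    "Ada":  {"Ben", "Cara"},
    "Ben":  {"Ada", "Cara"},
    "Cara": {"Ada", "Ben"},
    "Dev":  {"Eli"},
    "Eli":  {"Dev"},
}

# Degree centrality: the fraction of all other nodes a node is
# directly tied to.
n = len(graph)
centrality = {v: len(nbrs) / (n - 1) for v, nbrs in graph.items()}

def components(g):
    """Subgroup identification as connected components, via BFS."""
    seen, groups = set(), []
    for start in g:
        if start in seen:
            continue
        group, queue = set(), deque([start])
        while queue:
            v = queue.popleft()
            if v in group:
                continue
            group.add(v)
            queue.extend(g[v] - group)
        seen |= group
        groups.append(group)
    return groups

print(centrality["Ada"])  # 0.5 (tied to 2 of the 4 other researchers)
print(sorted(sorted(c) for c in components(graph)))
# [['Ada', 'Ben', 'Cara'], ['Dev', 'Eli']]
```

UCINET's real feature set (role analysis, many centrality variants) goes far beyond this, but the underlying data structure is the same: a who-collaborates-with-whom adjacency list.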

Bonnie Spring from Northwestern University presented COALESCE (CTSA Online Assistance for Leveraging the Science of Collaborative Effort), which will "create, evaluate, and disseminate new, durable, readily accessible on-line learning resources to enhance essential skills needed to perform transdisciplinary, team-based, basic and clinical translational research." Four learning modules will be developed over the next two years, covering the "Science of Team Science," the "Team Science Research Process in Basic Science and in Clinical Science," and "Team Science in Behavioral Medicine." The Team Science module, for example, will introduce the key concepts of team science by showcasing successful national transdisciplinary NIH research programs and interviews with prominent team science experts.

In the spirit of a web portal for collaboration, we learned about a few tools that help researchers manage and evaluate collaborations. 1) Gary Olson from UC Irvine talked about a "Collaboration Success Wizard," a web-based tool to help researchers assess the prospective success of a collaborative project before it starts. The tool is expected to be available in July this year. 2) Howard Gadlin, who runs – as he puts it – an emergency room for team science at the NIH Center for Cooperative Resolution, gave a fabulous presentation from the other end of the telescope, about the "dark side" of collaboration. He introduced us to the collaborative agreement, a "pre-nuptial agreement" for scientists, which helps scientific collaborators commence their project by anticipating, discussing, and resolving possible areas of disagreement. Using the pre-nup, the parties can jointly define a process for constructively handling disputes should they arise in the future. And 3) the National Cancer Institute will shortly launch a "Team Science Toolkit" intended to provide an online hub for team science-related resources and communication.

William Trochim from Cornell University introduced us to concept mapping, a mixed methods participatory approach that combines group processes (brainstorming, sorting, group interpretation) with a sequence of multivariate statistical analyses (multidimensional scaling, hierarchical cluster analysis) – maybe something to explore in the light of our upcoming survey and research projects. See his paper about “Concept Mapping as an Alternative Approach for the Analysis of Open-Ended Survey Responses”.
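The statistical core of concept mapping — grouping items by how similarly participants sorted them — can be illustrated with a bare-bones single-linkage hierarchical clustering in plain Python. The items and dissimilarities below are invented, and a real concept-mapping analysis would also run multidimensional scaling, which this sketch omits:

```python
# Items from a hypothetical brainstorming session. Dissimilarities
# (0 to 1) would come from how often participants sorted two items
# into different piles.
items = ["funding", "grants", "mentoring", "training"]
dist = {
    ("funding", "grants"): 0.1,
    ("funding", "mentoring"): 0.8,
    ("funding", "training"): 0.9,
    ("grants", "mentoring"): 0.7,
    ("grants", "training"): 0.85,
    ("mentoring", "training"): 0.2,
}

def d(a, b):
    """Symmetric lookup into the dissimilarity matrix."""
    return dist[(a, b)] if (a, b) in dist else dist[(b, a)]

def single_linkage(items, stop_at):
    """Repeatedly merge the two closest clusters until `stop_at` remain.
    Single linkage: cluster distance = distance of the closest pair."""
    clusters = [frozenset([i]) for i in items]
    while len(clusters) > stop_at:
        best = min(
            ((ci, cj) for i, ci in enumerate(clusters)
                      for cj in clusters[i + 1:]),
            key=lambda p: min(d(a, b) for a in p[0] for b in p[1]),
        )
        clusters = [c for c in clusters if c not in best]
        clusters.append(best[0] | best[1])
    return clusters

print(sorted(sorted(c) for c in single_linkage(items, 2)))
# [['funding', 'grants'], ['mentoring', 'training']]
```

The appeal of the method for open-ended survey data is that the clusters emerge from participants' own sorting judgments rather than from categories the analyst imposes in advance.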

Katy Boerner from Indiana University spoke about "Cyberinfrastructures for Network Science." She presented a couple of tools, such as 1) the Network Workbench, a large-scale network analysis, modeling, and visualization toolkit that purportedly supports network science research across scientific boundaries, and 2) the Scholarly Database (SDB), which focuses on supporting large studies of changes in science over time and on communicating findings via knowledge-domain visualizations. The database currently provides access to around 18 million publications, patents, and grants. In the future, Boerner said, she wants to leverage the power of network analysis to better understand what delays and inhibits science. The tools are available at http://sci.slis.indiana.edu/registration/user/.

More resources from the SciTS Conference are available at http://scienceofteamscience.northwestern.edu/team-science-resources


Written by: Rachael Sak, Leslie Yuan and Katja Reuter

Science 2.0

It is exactly what you think it is. The term came up in today's demo by Mendeley, which has a product similar to EndNote but with some crowdsourcing capabilities for categorizing content. You can Google the term yourself, of course, but here is a good introductory article on "Science 2.0": http://www.scientificamerican.com/article.cfm?id=science-2-point-0-great-new-tool-or-great-risk

Another Sharing Environment – World Wide Science

This is a global science gateway, which some of you may know, that connects national and international scientific databases and portals. Currently, 40 such databases and portals from more than 50 countries are searchable. New features seem to make it easier to share search results through social-networking sites, cluster results by author, topic, or date so users can quickly narrow results lists, and provide improved relevance ranking.