RNS SEO: How 52 research networking sites perform on Google, and what that tells us

Research networking systems (RNS) like VIVO, Profiles, SciVal Experts, and Pure are meant to be used — but they often fail to be discoverable by real users because of poor search engine optimization (SEO).

That’s why we’re releasing RNS SEO 2015, the first-ever report describing how RNS performs in terms of real-world discoverability on Google.

1. Methodology

We picked 52 different RNS that matched the following criteria:

  • associated with a single institution (excludes trade groups, collaborations, etc.)
  • based in a majority English-language locale (excludes France, Germany, etc.)
  • meant to be accessible to the public (excludes systems behind a firewall, not on port 80, etc.)

Here’s how we did it:

  1. For every system, get a list of every profiled user and the associated URL (e.g. “Kevin Labar” at https://scholars.duke.edu/display/per0978462)
  2. For every system, select 500 people at random (under 500 users in a system? select them all)
  3. For each name, search Google for “First Last Institution” (e.g. “Kevin Labar duke”)
  4. See if the RNS profile page shows up among the top 3 search results

We looked at search rankings for 24,583 profile pages across 52 different RNS sites, using the RankTank keyword rank checker tool to do bulk checking.
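In code, the per-site scoring in steps 2-4 looks roughly like this. It's a sketch: `top_results_for` is a hypothetical stand-in for the actual Google querying, which we did in bulk with the RankTank tool.

```python
import random

def sample_profiles(profiles, k=500, seed=42):
    """Pick up to k random (name, url) pairs from a site's profile list.
    Sites with fewer than k profiles are used in full."""
    rng = random.Random(seed)
    if len(profiles) <= k:
        return list(profiles)
    return rng.sample(profiles, k)

def seo_score(profiles, top_results_for):
    """Fraction of sampled profiles whose URL appears in the top 3
    search results for "First Last Institution". `top_results_for` is a
    stand-in for the actual rank checker (we used RankTank)."""
    sample = sample_profiles(profiles)
    hits = sum(
        1 for name, url in sample
        if url in top_results_for(name)[:3]
    )
    return hits / len(sample)
```

A site where every sampled profile ranks in the top 3 would score 1.0, i.e. 100%.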

2. The Results

We’re measuring search engine optimization success by looking at what percent of a site’s profiles appear within the top 3 Google results for “First Last Institution”.

  1. University of Melbourne 95% [VIVO] [under official domain]
  2. University of Massachusetts 91% [Profiles] [under official domain]
  3. University of California, San Francisco 87.8% [Profiles] [under official domain]
  4. Cornell 80.6% [VIVO] [under official domain]
  5. University of Colorado Denver 76% [Profiles] [under official domain]
  6. Boston University 73.8% [Profiles] [under official domain]
  7. Harvard University 71.8% [Profiles] [under official domain]
  8. Georgia Regents University 66.8% [Pure]
  9. University of Minnesota 64.2% [SciVal Experts] [under official domain]
  10. Penn State 62.6% [Profiles] [under official domain]
  11. Michigan State University 59.2% [SciVal Experts] [under official domain]
  12. Northwestern 55% [VIVO] [under official domain]
  13. Northwestern University 54.6% [SciVal Experts] [under official domain]
  14. Ohio State University 54.4% [Pure]
  15. University of South Africa 45% [SciVal Experts]
  16. Clinical Translational Science Institute at Children’s National 43.7% [SciVal Experts]
  17. University of California, San Diego 42.4% [Profiles] [under official domain]
  18. University of Colorado Boulder 42.2% [VIVO] [under official domain]
  19. Stanford University 38.4% [Custom] [under official domain]
  20. Wake Forest 38.4% [Profiles] [under official domain]
  21. University of Maryland-Baltimore 37% [Pure]
  22. University of Iowa 35% [Custom] [under official domain]
  23. University of Southern California 34.6% [Profiles]
  24. University of Illinois – Chicago 29.6% [SciVal Experts]
  25. University of Hawai‘i 26.9% [VIVO]
  26. Case Western Reserve University 23.4% [SciVal Experts]
  27. Thomas Jefferson University 23.4% [Profiles] [under official domain]
  28. Johns Hopkins University 22.2% [SciVal Experts]
  29. Scripps Research Institute 20.4% [VIVO] [under official domain]
  30. University of Nevada, Las Vegas 20% [VIVO]
  31. Wayne State University 19.2% [SciVal Experts]
  32. Washington State University 18.2% [SciVal Experts]
  33. University of Pennsylvania 17% [VIVO] [under official domain]
  34. University of Montana 15.6% [VIVO]
  35. Western Michigan University 15.4% [SciVal Experts]
  36. Duke University 15.2% [VIVO] [under official domain]
  37. University of Rochester 14.6% [Profiles] [under official domain]
  38. University of Florida 14.2% [VIVO] [under official domain]
  39. Montana State University 12.9% [VIVO]
  40. University of Nebraska 12.8% [SciVal Experts]
  41. University of Nevada, Reno 12.3% [VIVO]
  42. Temple University 11% [Pure]
  43. Northern Arizona University 10.8% [SciVal Experts]
  44. University of Arizona 10% [SciVal Experts]
  45. Arizona State University 9.2% [SciVal Experts]
  46. University of California, Davis 8.8% [SciVal Experts]
  47. University of Miami 7.4% [SciVal Experts]
  48. Albert Einstein College of Medicine 4% [SciVal Experts]
  49. University of Utah 2% [SciVal Experts]
  50. Texas A&M 1.2% [VIVO] [under official domain]
  51. Indiana University 0.6% [SciVal Experts]
  52. Oregon Health & Science University 0.6% [SciVal Experts]

The top sites have incredible search engine optimization — fourteen sites have scores of over 50%, and five have scores of over 75%. On the flip side, twenty-three of the sites have scores of 20% or below, leaving lots of room for improvement.

3. Lessons Learned

Which software should you pick? It’s possible to succeed with any platform — though Profiles does particularly well, owning 6 of the top 10 spots.

  • Profiles average score = 56%
  • Elsevier Pure average score = 42%
  • Custom software average score = 37%
  • Vivo average score = 31%
  • SciVal Experts average score = 22%

But this is complicated by the fact that systems that are hosted on a domain name other than that of their parent institution perform much worse:

  • Official domain? (e.g. vivo.cornell.edu)
    average score = 49%
  • Other domain? (e.g. experts.scival.com/asu)
    average score = 21%

Putting that together:

  • SciVal Experts + official domain average score = 59%
  • Profiles + official domain average score = 58%
  • Elsevier Pure + other domain average score = 42%
  • VIVO + official domain average score = 38%
  • Custom + official domain average score = 37%
  • Profiles + other domain average score = 35%
  • VIVO + other domain average score = 18%
  • SciVal Experts + other domain average score = 14%

While the n for each of these combinations is small, the breakdown gives us a sense of which combinations may work best. I was particularly disappointed to see that VIVO sites don’t score very well, on average, compared to other platforms.
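For the record, these group averages are plain unweighted means of the per-site scores. Here’s a sketch of the aggregation, using a handful of rows from the table above (the grouping logic is the point, not the subset):

```python
from collections import defaultdict
from statistics import mean

# (institution, score %, platform, on official domain?) — a few rows
# from the results table above, for illustration
sites = [
    ("University of Melbourne",   95.0, "VIVO",     True),
    ("University of Massachusetts", 91.0, "Profiles", True),
    ("UCSF",                      87.8, "Profiles", True),
    ("Arizona State University",   9.2, "SciVal Experts", False),
    ("Indiana University",         0.6, "SciVal Experts", False),
]

def average_by(sites, key):
    """Unweighted mean score per group, where `key` extracts the group."""
    groups = defaultdict(list)
    for row in sites:
        groups[key(row)].append(row[1])
    return {k: mean(v) for k, v in groups.items()}

by_platform = average_by(sites, key=lambda r: r[2])  # e.g. "Profiles"
by_domain   = average_by(sites, key=lambda r: r[3])  # official domain?
```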

The final lesson learned is how important it is to get a high volume and diversity of incoming links to help increase search rankings for a given site, something made clear by the Moz Search Engine Ranking Factors.

The top 3 sites have a huge diversity of incoming links:

  1. findanexpert.unimelb.edu.au has 488 linking root domains
  2. profiles.umassmed.edu has 249 linking root domains
  3. profiles.ucsf.edu has 858 linking root domains

The correlation between linking root domains and search rankings holds true across our dataset:

[Chart: SEO score vs. # of linking root domains per site. Linking root domain counts via Moz Open Site Explorer; data excludes experts.scival.com.]
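The correlation itself can be checked with a Spearman rank correlation, which assumes only a monotone relationship. Here is a minimal implementation; the (linking root domains, SEO score) pairs below are made up for illustration, while the real analysis used our full dataset and Moz’s link counts.

```python
def rankdata(xs):
    """1-based ranks; ties get the average of their positions."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based positions i..j
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(xs, ys):
    """Spearman rho = Pearson correlation of the ranks."""
    rx, ry = rankdata(xs), rankdata(ys)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# made-up (linking root domains, SEO score) pairs, for illustration only
domains = [488, 249, 858, 30, 12, 120]
scores  = [95.0, 91.0, 87.8, 20.0, 5.0, 45.0]
```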


4. The five steps to increase your rankings

Already have a site running on an institutional domain? Here’s how to move ahead:

1. Be worthy of being linked to:

  • Most people care about people, not a generic information-finding site
  • Make profile pages beautiful and chock-full of information, so people will want to link to them

2. Establish benchmarks:

  • Install Google Analytics (or equivalent) on every page
  • Learn how to use it — we also recommend Web Analytics 2.0 by Avinash Kaushik

3. Get good with Google:

  • Add a sitemap.xml (sitemaps.org)
  • Register on Google Webmaster Tools to:
    • register your sitemap
    • catch indexing errors early
    • link to your Google Analytics account

4. Make sure your pages will look their best on search engines:

[Image: anatomy of a search result]


5. Kickstart the process of getting linked to:

  • Get campus sites to link to your homepage as a trusted campus resource
  • Get campus sites to link to individual profiles from departmental profiles, news stories, directory, etc.
  • Encourage reuse of your data via APIs, and ask for a link back as attribution

If that works:

  • Researchers will start linking to their profile pages on their own sites
  • Blogs and social media will start linking to your profile pages as authoritative sources
  • Departments may start linking to your profiles if your data is more current than theirs

Teenage Migraine Researcher Uses Mobile Technology to Enhance Study

A new clinical trial for adolescent migraine is underway, and it’s harnessing the power of consumer technology to collect better data and make study participation easier. The BRAiN-M Study, which is examining whether melatonin (a natural supplement) is effective in preventing teenage migraine, uses Fitbit devices and an online “headache diary” to collect data from study participants remotely.

Besides trying to figure out how to prevent teenage migraine, the study’s lead investigator, Dr. Amy Gelfand of UCSF, is looking to make pediatric migraine clinical trials more inclusive and accessible.

Gelfand says that participation in conventional pediatric migraine clinical trials is often too difficult for families to orchestrate. Most require many in-person visits, which results in missed school for the kids and missed work for their parents. This problem is exacerbated for families who face long travel distances because they don’t live near the study clinic.

By leveraging technology, the BRAiN-M Study is able to remove these barriers to participation. How big of a deal is this? Huge! A pediatric migraine study conducted in 2013 (Powers et al., JAMA) highlighted a startling statistic: 44% of those who declined to participate in the study cited either “distance too far” or “did not have time to participate” as their reason for declining. That’s a pretty significant number.

“I wanted to design a study that worked for families with hectic schedules and kept children from missing school. After all, we’re treating their migraine problem in part so that they can go to school more, not less! By conducting most of this study remotely using web, mobile and wearable technologies, we are able to lessen the burden of study participation,” says Gelfand.

The 7 Keys to Maximizing Email Survey Response Rates

Lessons learned after achieving a high email survey response rate for a recent NSF grant study on UCSF Profiles. Brought to you by Anirvan Chatterjee & Nooshin Latour

Your recipients don’t care about your email

The average office worker may get over 100 emails per day. Swiftly deleting or ignoring unwanted email can be the only way to stay afloat. These seven best practices will help ensure your email gets opened, read, and acted on — and not ignored or deleted.

We believe that our email marketing tactics, and our use of customized data to drive up survey responses, are widely applicable to research studies that can use targeted user data to increase participation.

1. Don’t use Outlook — here’s what to use instead

We send normal email from our personal email accounts (e.g. Outlook), but when every response matters, it’s critical to use an email service provider like ExactTarget or MailChimp instead. These cheap or free services allow you to:

  1. measure how many sent emails have been delivered
  2. measure how many delivered emails have been opened
  3. measure how many opened emails have been acted upon (had links clicked on)

Email service providers also let you compose email that looks good on smartphones, tablets, and computers, and helps ensure your mail doesn’t get flagged as spam.
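Those three measurements form a simple funnel, and it’s worth being precise about denominators when reporting them. A sketch (the convention shown, each stage divided by the previous one, matches how most email service providers report, but check your provider’s definitions):

```python
def funnel_rates(sent, delivered, opened, clicked):
    """Email funnel rates; each stage is divided by the previous one."""
    return {
        "delivery_rate": delivered / sent,
        "open_rate": opened / delivered,        # opens per delivered email
        "click_rate": clicked / opened,         # clicks per opened email
        "click_to_delivered": clicked / delivered,
    }

rates = funnel_rates(sent=1000, delivered=950, opened=230, clicked=31)
```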

2. You live or die by your subject line

Most recipients will decide to open or delete/ignore your email based on a quick glance at the subject line. How can you write a subject that compels a user to open the email?1

In his article “The Three Key Elements of Irresistible Email Subject Lines,” Brian Clark recommends the “Four U” approach to writing headlines:

  1. Useful: Is the promised message valuable to the reader?
  2. Ultra-specific: Does the reader know what’s being promised?
  3. Unique: Is the promised message compelling and remarkable?
  4. Urgent: Does the reader feel the need to read now?

MailChimp gives examples of several effective subject lines (70-93% open rates):

  • Preliminary Floor Plans for Southern Village Neighborhood Circle Members
  • MotorCycling Magazine Reader Survey
  • Inside Football: Summer Training Camp Preview Issue

…and several bad subject lines (0-12% open rates):

  • Final reminder for complimentary entry to attend the West Freelands BCI Cluster Conference 2006
  • Help Baylor create the ideal college experience
  • Don’t Let 2006 Slip Away Without a Tax Deductible Donation To the Children & Families of Omire

Not sure which subject line to use? You can do an A/B test to see — see #7!

3. Can the sender be trusted?

Which email are you more likely to open?

  • “Important Update” from “UCSF Chancellor <ucsfchancellor@ucsf.edu>”
  • “Important Update” from “Cheap Online Dealz <joe92934@hotmail.com>”

If you’re sending email from an email service provider like ExactTarget or MailChimp, you can easily change your email’s “From:” line to whichever name elicits the most interest or trust among recipients. (Send email only from addresses that you have permission to use.)

Not sure which email address to send from? You can do an A/B test to see — see #7!

4. Was your email opened?

If your subject and From: lines were effective, recipients are more likely to open your email. Email service providers (step #1) measure your email’s open rate — a lower bound for the proportion of recipients who actually opened and read your email.

Higher email open rates are associated with higher survey response rates.

MailChimp offers a list of open and click rates for email sent from across a wide range of industries, including:

  • Health and Fitness Industry
    • 24% open the emails, 3.6% click a link
  • Medical, Dental, and Healthcare Industry
    • 23% open the emails, 3.1% click a link

5. Get to the point with concise, scannable text

Imagine your recipient reading your email while standing in the checkout line — busy, distracted, and reading on a small screen. How do you ensure your email doesn’t get ignored or deleted?2

  1. Put the important ideas first
    can readers identify the main message and what you want them to do after reading the subject and first several sentences?
  2. The shorter your email, the more likely it will be read
    keep in mind that smartphones show much less text than Outlook / webmail screens
  3. Skip the jargon, and write like a human
    write more like the way you’d speak to a neighbor or family member
  4. Paragraphs and bullets help with readability
    breaking text into simple chunks makes it easier to understand and scan
  5. Visually emphasize what’s important
    use bolding judiciously to emphasize the most critical phrases in your email
  6. Deemphasize the boilerplate
    put boilerplate as far down as possible, so your main message comes first

6. What’s your call to action?

Your email has a tactical purpose, e.g. getting users to click a link to a survey. That’s your call to action.

  1. Make your call to action incredibly obvious
    use returns, bolding, color, buttons, etc. to make your call to action stand out
  2. Repeat your call to action several times
    readers may not see it the first time, so offer many opportunities to do the right thing
  3. Eliminate distractions
    e.g. if your email is full of extraneous links, those compete with your main message
  4. Consider the reader’s motivations and incentives
    why should busy readers take action? are they helping researchers understand a disease? fulfilling a prior commitment? eligible for a prize?

A clear call to action is critical. Here are three different internal UCSF administrative emails we sent — each with similar open rates, but very different click/action rates:

  • Survey of 2,300 UCSF researchers about their industry contacts:
    • 41+% opened the email, 31% completed survey
    • we had short text, strong incentives, and sent a reminder email to users who didn’t complete the survey the first time around
  • Informational email to UCSF Profiles owners about their profile pages
    • 46+% opened the email, 14% clicked a link to see their profile pages
    • link is prominent, but users may not have had a strong incentive to click it
  • Email from UCTV to UCSF Profiles users about videos being added to their profiles
    • 39+% opened the email, 13% clicked a link
    • main link is to individual’s UCSF Profiles page, secondary link to login page

7. Test before you send

  • Proofread the email before you send
    Have a colleague read for accuracy, and someone outside the field read for tone
  • A/B test to optimize your emails
    Send alternate subject lines to a subset of users, and use the most effective one (platforms like ExactTarget and MailChimp make this incredibly easy)
  • Don’t forget to proofread on a smartphone
  • Test your links
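For the A/B test in particular, “use the most effective one” benefits from a quick significance check, so a small difference in open rates isn’t mistaken for a real winner. Here’s a sketch of a two-proportion z-test (platforms like ExactTarget and MailChimp handle this for you; this just shows the underlying idea):

```python
from math import erf, sqrt

def ab_open_rate_test(opens_a, sent_a, opens_b, sent_b):
    """Two-proportion z-test: is variant B's open rate significantly
    different from variant A's? Returns (difference, two-sided p-value)."""
    pa, pb = opens_a / sent_a, opens_b / sent_b
    p = (opens_a + opens_b) / (sent_a + sent_b)          # pooled rate
    se = sqrt(p * (1 - p) * (1 / sent_a + 1 / sent_b))   # standard error
    z = (pb - pa) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return pb - pa, p_value
```

If variant B’s subject line gets 200 opens out of 1,000 sends against variant A’s 100 out of 1,000, the 10-point difference comes back highly significant; identical rates come back with p = 1.0.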


  1. Baseline industry email performance averages (http://mailchimp.com/resources/research/email-marketing-benchmarks) for comparison: 18% open rate; 3-6% click through rates, 3% conversion rate (http://www.marketingprofs.com/charts/2013/11664/e-commerce-benchmarks-email-referral-traffic-and-conversion-rates-drop-in-q2).
  2. Slide deck: Email strategy by CTSI to Increase Engagement of UCSF Profiles.

For additional information or questions, contact Anirvan Chatterjee, Data Strategy at CTSI (anirvan.chatterjee@ucsf.edu) or Nooshin Latour, Communications & Marketing (nooshin.latour@ucsf.edu).

[Image: anatomy of the winning NSF industry survey email]

UCSF Profiles Team Invited to Geneva, Switzerland

The UCSF Profiles team received more international attention last year for its enhancements to the Profiles product and its level of engaged users. Over the past several months, the Special Programme for Research and Training in Tropical Diseases (TDR) has been in talks with UCSF Profiles to gain insight and plan an approach to create a system that will show and track their researchers’ work around the globe. TDR is a global collaborative program sponsored by the United Nations Children’s Fund (UNICEF), the United Nations Development Programme (UNDP), the World Bank, and the World Health Organization (WHO).


Eric Meeks, left, and Brian Turner at the World Health Organization headquarters in Geneva, Switzerland to discuss Profiles research networking software.

At the TDR’s invitation, Eric Meeks, Chief Technology Officer, and Brian Turner, Product Director, both from UCSF CTSI, traveled to Geneva to meet with the TDR team. We discussed the capabilities of Profiles in depth, the work we’ve done over time to get the traffic and user engagement that we have, and the system-level requirements for their research networking/tracking system. We also reviewed their plans for the soon-to-be-released request for proposal (RFP) and provided feedback on technical RFPs and projects in general. We met for the majority of a day, then had a chance to see a bit of Geneva and enjoy some Swiss/French cuisine before heading home.

Next steps will be to review and possibly respond to the TDR’s forthcoming RFP as a potential service provider to host their Profiles instance.

We’ve completed our NSF Grant! UCSF Profiles and its use by external partners


UCSF Profiles is an example of a research networking system (RNS). These systems provide automated aggregation and mining of information to create profiles and networks of the people who make up an academic institution. RNSs have, in effect, become a new kind of ‘front door’ for the university, providing access to the university’s intellectual capital in a manner previously unattainable — i.e. one focused on expertise rather than schools or departments, thus intermingling experts regardless of where they’re officially housed. Against this backdrop, we wanted to understand how such a tool might enhance access to academic expertise by external partners, specifically industry, and improve UCSF’s response to industry interest.


To this end, we assessed the usage of UCSF Profiles by commercial entities in the biotech, medical device and pharmaceutical industries to understand both how the tool might be used to enable industry-academic interactions in general, and then get a snapshot for UCSF of the nature of industry interest in our faculty.

We systematically derived a list of 111 unique biomedical-related companies with identifiable IP addresses that viewed individual faculty profiles. In one year (July 1, 2013 through June 30, 2014), companies on that list viewed 2,618 UCSF profiles, each viewed one or more times by one or more users at one or more of those companies. By September 2014, 2,318 of those individuals were still at UCSF, representing roughly 35% of all profiles on UCSF Profiles at that time.

We found that researchers were viewed across the spectrum of seniority, with slight increases by seniority from postdocs and residents to assistant, associate, and full professors. Professors accounted for 53% of the pageviews from companies, and 64% (790 of 1,244) of professors were viewed at least once by at least one company during the year. Although professors were most viewed, a significant number of more junior assistant professors (39%, 381 of 972) and postdocs (31%, 326 of 1,055) were viewed. In terms of depth of interest, professors clearly got more pageviews on average than junior researchers (professors averaged 6.34 pageviews, associate professors 4.17, assistant professors 3.38, and postdocs 1.39).


We then sent a short email survey to all those who were viewed by industry (as defined above) in the past year. The survey assessed the following:

  • whether the individual viewed had a prior relationship to the company or not, to establish whether the tool was being used to view potential new collaborators for industry.
  • why they thought they might have been viewed by industry partners, to get an initial sense of areas of potential interest for industry viewing academic profiles online.

Of the 2,304 faculty and trainees who actually received the email survey (no bounces), 718 responded, a 31% response rate. Of those who responded, 237 (33%) had a prior relationship with the company that viewed them. Thus, the majority of views (481 of 718, or 67%) were of researchers who had no prior relationship with the company.

We also asked if the researcher was contacted by the company that viewed them. We found that 230 (33%; the similarity in numbers is a coincidence) were contacted by one or more companies. Professors who were viewed had a higher chance of being contacted (27% of viewed professors were contacted) than those more junior (24%, 17%, and 11% of viewed associate professors, assistant professors, and postdocs, respectively).

Since our goal was to support faculty in preparing for industry interest and/or to enhance the chance that a meaningful relationship develops, we were most interested in those without prior relationships. Of the 481 who had no prior relationship, 83 (17%) were contacted. Though we do not know how those contacts went, we are working on a process for the industry alliances office to receive regular reports based on these data so they can follow up individually.


Finally we analyzed user reports describing their sense of why industry would have been interested in viewing their profile and contacting them. This information provides guidance on elements of user profiles that can be enhanced in the future to improve engagement with industry partners as well as provides insights for follow-up from the relevant institutional office. We categorized responses under one of six buckets:

  1. interest in research collaboration
  2. interest in specific technology
  3. recruiting
  4. sales or other commercial interest
  5. don’t know
  6. other

Not surprisingly, most faculty thought the industry interest was based on their research (308 of 716, 43%), mostly, they thought, via a publication. Only a few thought it arose from their own prior collaboration with industry (63 of 716, 8%), and a few specifically suggested that a specialized technology from their lab could be of interest (22 of 716, 3%). Many did not know why industry had been interested in their profile (128 of 716, 17%), making them a key group to help and support in understanding the commercial implications and potential health impact of their work.

A key goal for this project was to enable UCSF to improve how we support the formation of industry-academic collaborations. We worked closely with the institutional offices that manage these relations and want to improve how they identify those that may need targeted help. We discussed tools & approaches and we are working to establish a regular process for reporting to enable this improvement.

Examples include:

  • Providing input on emerging scientists with research of value to industry. The junior faculty that did not have prior industry relations are an especially key subgroup that would otherwise not rise to the attention of industry alliances offices.
  • An overlay of those viewed by industry with length of time at UCSF could provide a shortlist of those who may have activities of interest for industry collaboration and be less likely to know about efforts at the university that can facilitate those interactions.
  • The companies found viewing faculty profiles can be compared to those who are establishing contracts to understand and potentially engage companies that show interest but haven’t converted into specific alliances.
  • Programs such as the Early Translational Research program can use regular reports based on the analyses we modeled to send targeted solicitation of research proposals that may require support to advance (often when the faculty member themselves may not be aware of this).
  • Programs can use identified faculty who are of interest to industry for focus groups or other forums to further customize programs.
  • Detailed company-specific data can be generated to enable the industry alliances office to build more effective partnerships.

Want more details? Let us know! We have lots more data that we can share if your interest is piqued. Send inquiries to profiles@ucsf.edu.

SEO for Research Networking: How to boost Profiles/VIVO traffic by an order of magnitude

"Redwoods" by Michael Balint (cc-by)

The UCSF Profiles team has increased site usage by over an order of magnitude since the site’s big campus-wide launch in 2010. This “growth hacking” cheat sheet distills the key lessons learned during that period, and can be applied to almost any research networking platform, including VIVO, Profiles, and home-grown solutions.

1. Measure Everything

  • Install Google Analytics
    • Set it up on every page of the site
  • Learn how to use it
  • Segment on-campus vs. off-campus use
    • Find your “service provider” name(s) at Audience > Technology > Network
    • Create an advanced segment that includes only your service provider(s), and one that excludes it/them
    • Use these two segments to analyze everything (internal and external visitors are totally different, and need to always be analyzed separately)
  • Register with Google Webmaster Tools
    • Go to google.com/webmasters/tools
    • Follow the directions to register your site
    • See how your site’s indexed on Google, and check for issues
  • Check the Recommendations for RNS Usage Tracking

2. Ignore Your Homepage, Focus on Profile Pages

  • On a mature search-optimized RNS like UCSF Profiles, only 2.6% of visits start on the homepage
  • If you’re successful with steps 3-4, traffic directly to profile pages will skyrocket, and dominate traffic. That means you need to focus most of your attention on the care, feeding, and design of profile pages, vs. the home page.

3. Search Engine Optimization (SEO)

  • Make sure search engines can see your pages
    • Tweak your robots.txt so search engines can see all your pages (robotstxt.org)
    • Create a dynamically-generated sitemap of all your profile pages (sitemaps.org)
    • Mention your sitemap in your robots.txt file, and then register it with Google Webmaster Tools
    • Wait a day, use Google Webmaster Tools to validate that your sitemap works
  • Improve the copy on your profile page titles and descriptions
    • Make the page <title> on profile pages short and globally unique
    • Make <meta name="description"> on profile pages readable and descriptive
      (e.g. “Jane Doe’s profile, publications, research topics, and co-authors”)
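The dynamically-generated sitemap mentioned above is just the full list of profile URLs in the sitemaps.org XML format. A minimal sketch (the URLs are illustrative):

```python
from xml.etree import ElementTree as ET

def build_sitemap(profile_urls):
    """Build a sitemaps.org-format sitemap from a list of profile URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for u in profile_urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = u
    return ET.tostring(urlset, encoding="unicode")

# illustrative profile URLs
xml = build_sitemap([
    "https://profiles.example.edu/jane.doe",
    "https://profiles.example.edu/john.smith",
])
```

In practice you’d regenerate this from your profile database on a schedule, serve it at a stable URL, and point to it from robots.txt and Google Webmaster Tools.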

4. Add extra professional metadata

  • Follow the directions at schema.org and schema.org/Person to add people-oriented HTML metadata to your profile pages
  • Use google.com/webmasters/tools/richsnippets to test your syntax
  • OPTIONAL: Use “pretty” URLs — and include names if possible (e.g. http://www.yoursite.edu/firstname.lastname)
    • Pretty URLs should be the “real” final URL, not just a redirect
    • All old or alternative profile URLs should do a 301 redirect to the pretty URL
  • OPTIONAL: Prevent indexing of multiple versions of your page
    • If you have multiple versions of your page getting indexed (e.g. /url/ vs. /url/?a=b), tell search engines which version is the main one by using the rel=canonical canonical link element
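One lightweight way to emit the schema.org/Person markup is to render it server-side as a JSON-LD script block. A sketch (the names and fields are illustrative; schema.org/Person lists the full vocabulary, and Google’s testing tool can validate the result):

```python
import json

def person_jsonld(name, job_title, affiliation, url):
    """Render a schema.org/Person JSON-LD script block for a profile page."""
    data = {
        "@context": "https://schema.org",
        "@type": "Person",
        "name": name,
        "jobTitle": job_title,
        "affiliation": {"@type": "Organization", "name": affiliation},
        "url": url,
    }
    return ('<script type="application/ld+json">'
            + json.dumps(data)
            + "</script>")

# hypothetical profile data, for illustration
snippet = person_jsonld("Jane Doe", "Professor", "Example University",
                        "https://profiles.example.edu/jane.doe")
```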

5. Get Inbound Links

  • Get webmasters to link to your homepage from campus resource guides, etc.
  • Get webmasters to link to individual profiles from departmental faculty profiles, news stories, campus directory, etc.
  • Encourage reuse of your data via APIs, and ask for a link back as attribution (downstream users save time and money; you get links back in return)
  • All these new links may not send traffic, but will help SEO.

Have questions? Suggestions? Leave a comment below, or contact Anirvan Chatterjee directly.

Photo credit: Michael Balint, used under Creative Commons attribution license

UCSF Profiles coauthorship networks, by degree

We’re using UCSF Profiles data to explore whether co-authorship networks are a good way to show the connections between researchers at UCSF.

We can start off by looking at immediate co-authorship connections. I was surprised at how few current UCSF co-authors most users have. The flip side of co-authoring widely outside of one’s institution is that there are fewer internal co-authors:

[Chart: average # of UCSF contacts 1 degree away, by seniority]

The numbers jump when you go one degree further out, though the relative proportions are similar:

[Chart: average # of 1st- and 2nd-degree UCSF contacts, by seniority]

The numbers grow further when we count first, second, and third degree coauthors.

[Chart: average # of 1st- through 3rd-degree UCSF contacts, by seniority]
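Counting 1st- through 3rd-degree contacts is a breadth-first traversal of the co-authorship graph. A sketch on a toy graph (the real computation ran over UCSF Profiles co-authorship data):

```python
from collections import deque

def contacts_within(graph, start, max_degree):
    """Count distinct people reachable within max_degree co-authorship hops."""
    seen = {start}
    frontier = deque([(start, 0)])
    while frontier:
        person, d = frontier.popleft()
        if d == max_degree:
            continue
        for coauthor in graph.get(person, ()):
            if coauthor not in seen:
                seen.add(coauthor)
                frontier.append((coauthor, d + 1))
    return len(seen) - 1  # exclude the starting person

# toy co-authorship graph (undirected, as adjacency sets)
toy = {
    "a": {"b", "c"},
    "b": {"a", "d"},
    "c": {"a"},
    "d": {"b", "e"},
    "e": {"d"},
}
```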

My big takeaways are unsurprising:

  • The number of UCSF co-authorships generally grows with seniority — which may correlate with both the length of one’s career and one’s tenure at UCSF
  • Even in the case of professors’ 1st-3rd degree connections, we’re maxing out at 180 people, out of about 6,500 people in UCSF Profiles. This number may correlate with the size of one’s department/field at UCSF.
  • If we showed logged-in UCSF Profiles users a “here’s how you know this person” visualization when looking at another random user’s profile, it would kick in pretty infrequently — though that might be different for folks in the same field/department
