
# DSpace Statistics API

DSpace stores item view and download events in a Solr "statistics" core. This information is available for use in the various DSpace user interfaces, but is not exposed externally via any APIs. The DSpace 4/5/6 REST API, for example, only exposes information about communities, collections, item metadata, and bitstreams.

This project contains an indexer and a Falcon-based web application to make the statistics available via a simple REST API. You can read more about the Solr queries used to gather the item view and download statistics on the DSpace wiki.

If you use the DSpace Statistics API please cite:

Orth, A. 2018. DSpace statistics API. Nairobi, Kenya: ILRI. https://hdl.handle.net/10568/99143.

## Requirements

- Python 3
- PostgreSQL
- A DSpace repository with a Solr "statistics" core

## Installation

Create a Python virtual environment and install the dependencies:

```
$ python3 -m venv venv
$ source venv/bin/activate
$ pip install -r requirements.txt
```

## Running

Set up the environment variables for Solr and PostgreSQL:

```
$ export SOLR_SERVER=http://localhost:8080/solr
$ export DATABASE_NAME=dspacestatistics
$ export DATABASE_USER=dspacestatistics
$ export DATABASE_PASS=dspacestatistics
$ export DATABASE_HOST=localhost
```
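The indexer and web application read these settings from the environment. A minimal sketch of such configuration handling, assuming `os.environ` lookups with the defaults shown above (the module layout and helper name here are illustrative, not the project's actual code):

```python
import os

# Read connection settings from the environment, falling back to the
# values used in the examples above (illustrative defaults).
SOLR_SERVER = os.environ.get("SOLR_SERVER", "http://localhost:8080/solr")
DATABASE_NAME = os.environ.get("DATABASE_NAME", "dspacestatistics")
DATABASE_USER = os.environ.get("DATABASE_USER", "dspacestatistics")
DATABASE_PASS = os.environ.get("DATABASE_PASS", "dspacestatistics")
DATABASE_HOST = os.environ.get("DATABASE_HOST", "localhost")


def database_dsn() -> str:
    """Build a libpq-style connection string from the settings above."""
    return (
        f"dbname={DATABASE_NAME} user={DATABASE_USER} "
        f"password={DATABASE_PASS} host={DATABASE_HOST}"
    )
```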

Index the Solr statistics core to populate the PostgreSQL database:

```
$ python -m dspace_statistics_api.indexer
```

Run the REST API:

```
$ gunicorn dspace_statistics_api.app
```

Test to see if there are any statistics:

```
$ curl 'http://localhost:8000/items?limit=1'
```

## Testing

Install development packages using pip:

```
$ pip install -r requirements-dev.txt
```

Run tests:

```
$ pytest
```

## Deployment

There are example systemd service and timer units in the contrib directory. The API service listens on localhost by default so you will need to expose it publicly using a web server like nginx.
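The units in the contrib directory are the reference; purely as an illustration, a systemd service unit for the API might look like the following (the paths, user, and bind address are assumptions, not the project's actual unit):

```ini
[Unit]
Description=DSpace Statistics API
After=network.target

[Service]
Environment=SOLR_SERVER=http://localhost:8080/solr
Environment=DATABASE_NAME=dspacestatistics
User=nobody
WorkingDirectory=/opt/dspace-statistics-api
ExecStart=/opt/dspace-statistics-api/venv/bin/gunicorn \
    --bind 127.0.0.1:5000 \
    dspace_statistics_api.app

[Install]
WantedBy=multi-user.target
```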

An example nginx configuration is:

```nginx
server {
    # ...

    location ~ /rest/statistics/?(.*) {
        access_log /var/log/nginx/statistics.log;
        proxy_pass http://statistics_api/$1$is_args$args;
    }
}

upstream statistics_api {
    server 127.0.0.1:5000;
}
```

This would expose the API at /rest/statistics. Note that the port in the `upstream` block must match the address gunicorn binds to; gunicorn binds to 127.0.0.1:8000 by default, so start it with `--bind 127.0.0.1:5000` to match this example.

## Using the API

The API exposes the following endpoints:

- `GET /` — return a basic API documentation page.
- `GET /items` — return views and downloads for all items that Solr knows about¹. Accepts `limit` and `page` query parameters for pagination of results (`limit` must be an integer between 1 and 100, and `page` must be an integer greater than or equal to 0).
- `GET /item/id` — return views and downloads for a single item (`id` must be a UUID). Returns HTTP 404 if an item id is not found.

The item id is the internal uuid for an item. You can get these from the standard DSpace REST API.

¹ We are querying the Solr statistics core, which technically only knows about items that have either views or downloads. If an item is not present here you can assume it has zero views and zero downloads, but not necessarily that it does not exist in the repository.
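The pagination constraints described above can be checked before issuing a request. A small sketch in Python (the helper names and the base URL are hypothetical, not part of the API):

```python
def validate_pagination(limit: int, page: int) -> None:
    """Enforce the documented constraints: 1 <= limit <= 100 and page >= 0."""
    if not 1 <= limit <= 100:
        raise ValueError("limit must be an integer between 1 and 100")
    if page < 0:
        raise ValueError("page must be an integer greater than or equal to 0")


def items_url(base: str, limit: int = 100, page: int = 0) -> str:
    """Build a /items request URL after validating the parameters."""
    validate_pagination(limit, page)
    return f"{base}/items?limit={limit}&page={page}"
```

For example, `items_url("http://localhost:8000", limit=1)` builds the same request used in the curl test above.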

## Todo

- Better logging
- Version API
- Use JSON in PostgreSQL
- Add top items endpoint, perhaps `/top/items` or `/items/top`?
- Make community and collection stats available
- Check IDs in database to see if they are deleted...

## License

This work is licensed under the GPLv3.

The license allows you to use and modify the work for personal and commercial purposes, but if you distribute the work you must provide users with a means to access the source code for the version you are distributing. Read more about the GPLv3 at TL;DR Legal.