Cloud Healthcare API Python Samples

This directory contains samples for the Cloud Healthcare API. The Cloud Healthcare API implements healthcare-native protocols and formats to accelerate the ingestion, storage, analysis, and integration of healthcare data with cloud-based applications. See the migration guide for information about migrating to Python client library v0.25.1.

To run the sample, you need to enable the API at: https://console.cloud.google.com/apis/library/healthcare.googleapis.com

To run the sample, you need to have the following roles:

  * Healthcare Dataset Administrator
  * Healthcare DICOM Store Administrator
  * Healthcare DICOM Editor
  * Healthcare DICOM Viewer

Setup

Authentication

This sample requires you to have authentication set up. Refer to the Authentication Getting Started Guide for instructions on setting up credentials for applications.
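As a quick check that credentials are discoverable, the following is a minimal sketch using Application Default Credentials. It assumes the google-auth package is installed and that GOOGLE_APPLICATION_CREDENTIALS points at a service account key (or that `gcloud auth application-default login` has been run); neither assumption comes from the samples themselves.

    # Minimal sketch: confirm Application Default Credentials can be loaded.
    import google.auth

    credentials, project_id = google.auth.default(
        scopes=["https://www.googleapis.com/auth/cloud-platform"]
    )
    print("Loaded credentials for project: {}".format(project_id))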

Install Dependencies

  1. Clone python-docs-samples and change directory to the sample directory you want to use.

    $ git clone https://github.com/GoogleCloudPlatform/python-docs-samples.git
  2. Install pip and virtualenv if you do not already have them. You may want to refer to the Python Development Environment Setup Guide for Google Cloud Platform for instructions.

  3. Create a virtualenv. Samples are compatible with Python 2.7 and 3.4+.

    $ virtualenv env
    $ source env/bin/activate
  4. Install the dependencies needed to run the samples.

    $ pip install -r requirements.txt

Samples

DICOM stores

To run this sample:

$ python dicom_stores.py

usage: dicom_stores.py [-h] [--project_id PROJECT_ID] [--location LOCATION]
                       [--dataset_id DATASET_ID]
                       [--dicom_store_id DICOM_STORE_ID]
                       [--pubsub_topic PUBSUB_TOPIC] [--uri_prefix URI_PREFIX]
                       [--content_uri CONTENT_URI]
                       [--export_format {FORMAT_UNSPECIFIED,DICOM,JSON_BIGQUERY_IMPORT}]
                       [--member MEMBER] [--role ROLE]
                       {create-dicom-store,delete-dicom-store,get-dicom-store,list-dicom-stores,patch-dicom-store,get_iam_policy,set_iam_policy,export-dicom-store,import-dicom-store}
                       ...

positional arguments:
  {create-dicom-store,delete-dicom-store,get-dicom-store,list-dicom-stores,patch-dicom-store,get_iam_policy,set_iam_policy,export-dicom-store,import-dicom-store}
    create-dicom-store  Creates a new DICOM store within the parent dataset.
    delete-dicom-store  Deletes the specified DICOM store.
    get-dicom-store     Gets the specified DICOM store.
    list-dicom-stores   Lists the DICOM stores in the given dataset.
    patch-dicom-store   Updates the DICOM store.
    get_iam_policy      Gets the IAM policy for the specified DICOM store.
    set_iam_policy      Sets the IAM policy for the specified DICOM store. A
                        single member will be assigned a single role. A member
                        can be any of: - allUsers, that is, anyone -
                        allAuthenticatedUsers, anyone authenticated with a
                        Google account - user:email, as in
                        'user:somebody@example.com' - group:email, as in
                        'group:admins@example.com' - domain:domainname, as in
                        'domain:example.com' - serviceAccount:email, as in
                        'serviceAccount:my-other-
                        app@appspot.gserviceaccount.com' A role can be any IAM
                        role, such as 'roles/viewer', 'roles/owner', or
                        'roles/editor'
    export-dicom-store  Export data to a Google Cloud Storage bucket by
                        copying it from the DICOM store.
    import-dicom-store  Imports data into the DICOM store by copying it from
                        the specified source.

optional arguments:
  -h, --help            show this help message and exit
  --project_id PROJECT_ID
                        GCP project name
  --location LOCATION   GCP location
  --dataset_id DATASET_ID
                        Name of dataset
  --dicom_store_id DICOM_STORE_ID
                        Name of DICOM store
  --pubsub_topic PUBSUB_TOPIC
                        The Cloud Pub/Sub topic that notifications of changes
                        are published on
  --uri_prefix URI_PREFIX
                        URI for a Google Cloud Storage directory to which
                        result files should be written (e.g., "bucket-
                        id/path/to/destination/dir").
  --content_uri CONTENT_URI
                        URI for a Google Cloud Storage directory from which
                        files should be imported (e.g., "bucket-
                        id/path/to/destination/dir").
  --export_format {FORMAT_UNSPECIFIED,DICOM,JSON_BIGQUERY_IMPORT}
                        Specifies the output format. If the format is
                        unspecified, the default functionality is to export to
                        DICOM.
  --member MEMBER       Member to add to IAM policy (e.g.
                        "domain:example.com")
  --role ROLE           IAM Role to give to member (e.g. "roles/viewer")
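These subcommands call the Cloud Healthcare REST API through the discovery-based Google API client. As an illustration only (not the exact code in dicom_stores.py), here is a sketch of creating a DICOM store; it assumes the v1 API surface, the google-api-python-client package, and placeholder project, location, dataset, and store IDs.

    # Sketch: create a DICOM store with the discovery-based client.
    # Resource names below are placeholders; adjust them to your project.
    from googleapiclient import discovery

    client = discovery.build("healthcare", "v1")

    project_id = "my-project"
    location = "us-central1"
    dataset_id = "my-dataset"
    dicom_store_id = "my-dicom-store"

    parent = "projects/{}/locations/{}/datasets/{}".format(
        project_id, location, dataset_id
    )

    request = (
        client.projects()
        .locations()
        .datasets()
        .dicomStores()
        .create(parent=parent, body={}, dicomStoreId=dicom_store_id)
    )
    response = request.execute()
    print("Created DICOM store: {}".format(response["name"]))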

DICOMweb

To run this sample:

$ python dicomweb.py

usage: dicomweb.py [-h] [--project_id PROJECT_ID] [--location LOCATION]
                   [--dataset_id DATASET_ID] [--dicom_store_id DICOM_STORE_ID]
                   [--dcm_file DCM_FILE] [--study_uid STUDY_UID]
                   [--series_uid SERIES_UID] [--instance_uid INSTANCE_UID]
                   {dicomweb-store-instance,dicomweb-search-instance,dicomweb-retrieve-study,dicomweb-search-studies,dicomweb-retrieve-instance,dicomweb-retrieve-rendered,dicomweb-delete-study}
                   ...

positional arguments:
  {dicomweb-store-instance,dicomweb-search-instance,dicomweb-retrieve-study,dicomweb-search-studies,dicomweb-retrieve-instance,dicomweb-retrieve-rendered,dicomweb-delete-study}
    dicomweb-store-instance
                        Handles the POST requests specified in the DICOMweb
                        standard.
    dicomweb-search-instance
                        Handles the GET requests specified in DICOMweb
                        standard.
    dicomweb-retrieve-study
                        Handles the GET requests specified in the DICOMweb
                        standard.
    dicomweb-search-studies
                        Handles the GET requests specified in the DICOMweb
                        standard.
    dicomweb-retrieve-instance
                        Handles the GET requests specified in the DICOMweb
                        standard.
    dicomweb-retrieve-rendered
                        Handles the GET requests specified in the DICOMweb
                        standard.
    dicomweb-delete-study
                        Handles DELETE requests equivalent to the GET requests
                        specified in the WADO-RS standard.

optional arguments:
  -h, --help            show this help message and exit
  --project_id PROJECT_ID
                        GCP project name
  --location LOCATION   GCP location
  --dataset_id DATASET_ID
                        Name of dataset
  --dicom_store_id DICOM_STORE_ID
                        Name of DICOM store
  --dcm_file DCM_FILE   File name for DCM file to store.
  --study_uid STUDY_UID
                        Unique identifier for a study.
  --series_uid SERIES_UID
                        Unique identifier for a series.
  --instance_uid INSTANCE_UID
                        Unique identifier for an instance.
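The DICOMweb subcommands issue standard DICOMweb HTTP requests against the store's dicomWeb path. The following is a rough sketch of a QIDO-RS search-studies request, similar in spirit to dicomweb-search-studies; it assumes the google-auth and requests packages, Application Default Credentials, and placeholder resource names.

    # Sketch: DICOMweb search-studies (QIDO-RS) request.
    # Resource names are placeholders; assumes Application Default Credentials.
    import google.auth
    from google.auth.transport.requests import AuthorizedSession

    credentials, _ = google.auth.default(
        scopes=["https://www.googleapis.com/auth/cloud-platform"]
    )
    session = AuthorizedSession(credentials)

    base_url = "https://healthcare.googleapis.com/v1"
    dicomweb_path = (
        "{}/projects/{}/locations/{}/datasets/{}/dicomStores/{}/dicomWeb/studies"
    ).format(base_url, "my-project", "us-central1", "my-dataset", "my-dicom-store")

    response = session.get(
        dicomweb_path, headers={"Accept": "application/dicom+json"}
    )
    response.raise_for_status()
    print(response.json())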

The client library

This sample uses the Google Cloud Client Library for Python. You can read the documentation for more details on API usage and use GitHub to browse the source and report issues.