Getting Started

Prerequisites

Cloud environment

The devices are connected to a cloud environment:

  • Production is the main shared infrastructure used for deploying applications into operation for multiple clients.

  • Integration is the main shared infrastructure used for testing development changes such as new features or bugfixes with realistic deployment settings.

In the Production environment, deployments are carried out once a month, usually with two weeks' advance notice on the Munic.io Dashboard. Production comes with a standard SLA, monitoring, and on-call duty.

In the Integration environment, by contrast, there are no restrictions on deployment and no on-call duty.

Please note that these environments don't communicate with each other and don't share the same data: for example, a user's token obtained in one environment cannot be used in the other.

Production vs Other Platforms

Several platforms exist; you should use the shared Production platform exclusively unless your account manager at Munic tells you otherwise.

OAuth protocol

Munic.Connect is an Identity Provider that unifies end-user access to all the applications they subscribe to in the Munic.io ecosystem.

Munic.Connect uses the OAuth protocol. With it, users can authorize your application to access their data.

You can create your user account here: Munic.Connect.

To fetch a token, you can run this sample command and adapt it to your program:

curl 'https://connect.munic.io/oauth/token' \
--header 'Content-Type: application/json' \
--header 'Accept: application/json' \
--data-raw '{
    "grant_type": "password",
    "username": "john.doe@example.com",
    "password": "123456789"
}'

The response to such a query is:

{
    "access_token": "eyJhbGciOiJSUzI1NiIsInByb3ZpZGVyIjoibXVuaWMiLCJ0eXAiOiJKV1QifQ.eyJleHAiOjE2OTU5OTcyNDUsInNjb3BlIjoicHVibGljIiwidXNlciI6eyJ1aWQiOiI5ZWYwNzRmMy03OTUyLTQ1YTMtYmFiMS1kOWI5MWIxNzQ2ZmEifX0.bI-tSI69mdeCjBLuSHH3Xe5JAWXFt0fdaE4cGsq4fcoYVP52gGBrBiukUjqxFoFqxrIhm0qd9lIkJFDJerW_SunQwcHy3x0KvDmWnJel0ZGK86V5px3HEjglF2EAswUd79gMCzSvII8iHeIDWjkEj0WRcYwvrAh5dhNODLnlgPwzdNUqHDEIjjMC_rOucxjlD2PrXQBwo547saoYIPJDUseqtGkqmKzoA1AZSOpHlDyXY6yuI6S7fqGFkys2ZzjfVDRg7Nrwf9-S4_ApBj9_ZN9uC-aFgk3ZEzQ_eo-YtlivN8TCwmw_eG7q-5XHqQOXiFychNNGnh9NEmn4hK2ong",
    "token_type": "Bearer",
    "expires_in": 7200,
    "refresh_token": "cLKyBTSh3yeWef6d2S82MJNQpxe-92ubb9KrkA",
    "scope": "public",
    "created_at": 1695997245
}

Where:

  • access_token is a JWT (JSON Web Token). You can inspect the token's contents on jwt.io. This token must be used as a bearer token when querying the GraphQL API (see here for more details). From now on, whenever we use the term munic connect token, we are referring to this access_token.
  • expires_in is the lifetime of the token, set to 7200 seconds (2 hours). Once the token expires, you should obtain a new one, either by repeating the query above or by using refresh_token as follows:
curl --location 'https://connect.munic.io/oauth/token' \
--header 'Content-Type: application/json' \
--data '{
    "grant_type": "refresh_token",
    "refresh_token": "cLKyBTSh3yeWef6d2S82MJNQpxe-92ubb9KrkA"
}'
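The access_token can also be inspected locally, not just on jwt.io: a JWT's payload is simply base64url-encoded JSON. A minimal sketch using only the standard library (it does not verify the signature, so use it for debugging only):

```python
import base64
import json

def jwt_payload(token):
    """Decode the payload (second segment) of a JWT without verifying it."""
    segment = token.split(".")[1]
    segment += "=" * (-len(segment) % 4)  # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(segment))

# Header and payload segments of the sample access_token above
# (the signature segment is irrelevant for decoding the claims)
sample = ("eyJhbGciOiJSUzI1NiIsInByb3ZpZGVyIjoibXVuaWMiLCJ0eXAiOiJKV1QifQ."
          "eyJleHAiOjE2OTU5OTcyNDUsInNjb3BlIjoicHVibGljIiwidXNlciI6eyJ1aWQiOiI5"
          "ZWYwNzRmMy03OTUyLTQ1YTMtYmFiMS1kOWI5MWIxNzQ2ZmEifX0.signature")

claims = jwt_payload(sample)
# claims["exp"] is the expiry timestamp, claims["scope"] the granted scope
```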
Production vs Other Platforms

Please note that if you are using a platform other than Production, you need to register and request your token from that platform's own Munic.Connect instance. For instance:

GraphQL

Before starting to develop with ekko, you need to learn the GraphQL language (QL stands for Query Language, as in SQL).

Here is the global documentation

Advantages of using GraphQL:

GraphQL is a query language and runtime for APIs. Unlike REST, which exposes a fixed set of endpoints for each resource, GraphQL exposes a single endpoint and allows clients to request exactly what they need. This reduces over-fetching and under-fetching of data.

Key Points:

  1. Single Endpoint: All GraphQL requests are directed at a single endpoint.
  2. Flexible: Clients specify what data they need, leading to efficient and specific data retrieval.
  3. Strongly Typed: GraphQL schema defines the types and relationships between types.

Munic has also created its own GraphQL documentation to explain its schema:

Alternative documentation for when you need to use the Integration environment:

GraphQL Query

GraphQL queries are a way to request specific data from the server. Unlike traditional REST APIs where you access a predefined endpoint and get a fixed set of data, with GraphQL, you describe in your query exactly what information you want, and the server responds accordingly.

Key Points:

  1. Selective Data Retrieval: Ask for what you need, get exactly that.
  2. Strongly Typed: Queries must adhere to the schema defined on the server.
  3. Nested Data: You can fetch related data in a single request, reducing the need for multiple API calls.

A typical GraphQL query has the following structure:

{
  field1(arg: "value") {
    subfield1
    subfield2
  }
  field2
}

Where:

  • field1 and field2 are the main data fields you want to retrieve.
  • arg is an argument you’re passing to field1 to influence its returned data.
  • subfield1 and subfield2 are nested fields within field1.

To fetch data using GraphQL queries in Python, the gql library is a convenient choice, coupled with a suitable transport (like HTTP).
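If gql is not installed yet, it can typically be installed together with its requests-based transport via the package's extras (package and extra names as published on PyPI; check the gql documentation for your version):

```shell
pip install "gql[requests]"
```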

Then write your Python code:

from os import environ
from gql import gql, Client
from gql.transport.requests import RequestsHTTPTransport

# Define the GraphQL query
QUERY = gql("""
    query {
        account(id: "me") {
            full_name
            email
        }
    }
""")

# Set up the HTTP transport layer
transport = RequestsHTTPTransport(
    url="https://api.munic.io/services/ekko/v2/graphql",
    # Replace MUNIC_AUTH environ or set it up directly
    headers={"Authorization": environ["MUNIC_AUTH"]}
)

# Create a GraphQL client using the transport
client = Client(
    transport=transport,
    fetch_schema_from_transport=True,
)

# Execute the query
result = client.execute(QUERY)

# Print the result
print(result)
  1. Import necessary modules: We’re using the gql library for constructing and executing GraphQL queries, and the RequestsHTTPTransport for sending these queries over HTTP.
  2. Define your query: This is a sample GraphQL query. Replace it with your own.
  3. Setting up the HTTP transport: The RequestsHTTPTransport class sets up the HTTP connection to the GraphQL server endpoint, with a valid bearer token.
  4. Create a GraphQL client: The Client class manages the GraphQL session. Ensure the transport is set to the previously defined HTTP transport. The fetch_schema_from_transport=True argument tells the client to fetch the GraphQL schema when it is created.
  5. Execute the query: The client.execute() method sends the query and waits for a response, which is then printed to the console.

Best practices and tips:

  • Always validate your GraphQL queries against the server’s schema before executing them.
  • Make use of GraphQL variables instead of string interpolation for dynamic query values. It’s safer and more efficient.
  • Handle potential exceptions, such as network errors or data fetching errors, to ensure your application is robust.
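On the variables tip: with the gql client, variables are passed via execute(..., variable_values={...}); at the HTTP level, a GraphQL request is simply a JSON body with a query and a variables key. A sketch of that body, reusing the account query above (the build_graphql_payload helper is illustrative, not part of any library):

```python
import json

QUERY = """
query AccountById($id: String!) {
    account(id: $id) {
        full_name
        email
    }
}
"""

def build_graphql_payload(query, variables=None):
    """Serialize a GraphQL request body; values travel as JSON in the
    variables map, not via string interpolation, so they are always
    properly escaped and type-checked against the schema."""
    return json.dumps({"query": query, "variables": variables or {}})

payload = build_graphql_payload(QUERY, {"id": "me"})
```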

GraphQL Mutation

While GraphQL queries fetch data from a server, mutations modify data. Think of mutations as equivalent to the POST, PUT, PATCH, and DELETE methods in REST APIs. In GraphQL, all these modifications are bundled under mutations.

Key Points:

  1. Data Modification: Mutations change data on the server.
  2. Strongly Typed: Mutations must adhere to the schema defined on the server.
  3. Response Retrieval: Just like queries, mutations allow you to ask for the data you’d like to see in the response, offering feedback about the operation.

A typical GraphQL mutation has the following structure:

mutation {
  modifyData(input: { field1: "value1", field2: "value2" }) {
    returnedField1
    returnedField2
  }
}

Where:

  • modifyData is the mutation you want to perform.
  • input is the data you’re passing to the mutation.
  • returnedField1 and returnedField2 are fields from the modified data you want to retrieve in the response.

To modify data using GraphQL mutations in Python, we’ll again use the gql library, coupled with a suitable transport (like HTTP).

Then write your Python code:

from os import environ
from gql import gql, Client
from gql.transport.requests import RequestsHTTPTransport

# Define the GraphQL mutation
MUTATION = gql("""
    mutation {
        account_update(full_name: "My new name") {
            full_name
        }
    }
""")

# Set up the HTTP transport layer
transport = RequestsHTTPTransport(
    url="https://api.munic.io/services/ekko/v2/graphql",
    headers={"Authorization": environ["MUNIC_AUTH"]}
)

# Create a GraphQL client using the transport
client = Client(
    transport=transport,
    fetch_schema_from_transport=True,
)

# Execute the mutation
result = client.execute(MUTATION)

# Print the result
print(result)
  1. Import necessary modules: We’re using the gql library for constructing and executing GraphQL mutations, and the RequestsHTTPTransport for sending these mutations over HTTP.
  2. Define your mutation: This is a sample GraphQL mutation. Replace it with your own.
  3. Setting up the HTTP transport: The RequestsHTTPTransport class sets up the HTTP connection to the GraphQL server endpoint, with a valid bearer token.
  4. Create a GraphQL client: The Client class manages the GraphQL session. Ensure the transport is set to the previously defined HTTP transport. The fetch_schema_from_transport=True argument tells the client to fetch the GraphQL schema when it is created.
  5. Execute the mutation: The client.execute() method sends the mutation and waits for a response, which is then printed to the console.

Best Practices and Tips:

  • Always validate your GraphQL mutations against the server’s schema before executing them.
  • Make use of GraphQL variables for dynamic mutation values, ensuring type safety and cleaner code.
  • Handle potential exceptions, such as network errors or data modification errors, to ensure your application is robust.

GraphQL Subscription

Subscriptions are a GraphQL feature that allows a server to send data to its clients when a specific event happens. This is beneficial for real-time applications where data changes frequently.

Key Points:

  1. Real-time Data: Immediate feedback without repeatedly polling the server.
  2. Event-Driven: Server sends data based on specific events or triggers.
  3. Persistent Connection: Utilizes WebSockets to maintain a consistent connection between client and server.

Subscriptions are based on WebSockets and lack a consensus standard. We decided to implement the Phoenix WebSocket protocol and to use the Absinthe framework, which has its own implementation of subscriptions.

Refer to the mutation section of the GraphQL documentation to check which events can be listened to.

To use GraphQL subscriptions in Elixir with Phoenix WebSockets and Absinthe, you will need to:

  1. Create a Phoenix WebSocket channel for your subscription.
  2. Create an Absinthe subscription resolver for your subscription.
  3. Subscribe to the subscription from your client.

In Python, this would be:

  1. Import the websockets package.
  2. Create a WebSocket connection to the GraphQL server.
  3. Send a subscription request to the server.
  4. Receive updates from the server.
from os import environ
import asyncio
import websockets
import json

# Base websocket url
API_ENDPOINT = "wss://api.munic.io/services/ekko/v2/"
# JWT token with "Bearer " prefix
API_AUTH = environ['MUNIC_AUTH']

# Phoenix channel join payload
SUBSCRIPTION_PAYLOAD = {
    "topic": "__absinthe__:control",
    "event": "phx_join",
    "payload": {},
    "ref": "1"
}

# Graphql subscription query
QUERY = """
subscription {
    deviceNotifications(deviceIds: ["359......."]) {
      deviceId
        ... on Track {
            fields {
                name,
                coordinates { lat, lng, time }
            }
        }
    }
}
"""

# GraphQL subscription payload for Absinthe
GRAPHQL_SUBSCRIPTION_MESSAGE = {
    "topic": "__absinthe__:control",
    "event": "doc",
    "payload": {
        "query": QUERY,
        "variables": {}  # Add any variables if needed
    },
    "ref": "2"
}


async def handle_subscription():
    url = f"{API_ENDPOINT}socket/websocket?Authorization={API_AUTH}".replace(' ', '+')
    async with websockets.connect(url) as ws:

        # Join the GraphQL topic for Absinthe
        await ws.send(json.dumps(SUBSCRIPTION_PAYLOAD))
        response = await ws.recv()
        print(f"Join response: {response}")

        # Send the GraphQL subscription tailored for Absinthe
        await ws.send(json.dumps(GRAPHQL_SUBSCRIPTION_MESSAGE))

        while True:
            message = await ws.recv()
            message_data = json.loads(message)

            # Handle the Phoenix heartbeat (ping/pong mechanism)
            if message_data["event"] == "phx_ping":
                print("pong", message_data["ref"])
                await ws.send(json.dumps({
                    "topic": "phoenix",
                    "event": "phx_pong",
                    "payload": {},
                    "ref": message_data["ref"]
                }))
            else:
                # Handle incoming GraphQL data here
                print(message_data)


# Run the asynchronous function
asyncio.run(handle_subscription())

In this version, we kept the structure of the Phoenix protocol but adapted the GraphQL message payload to what Absinthe usually expects. This mainly involves adding the “variables” key for potential variables you’d pass with your GraphQL subscription (even if it’s empty).

Finally, if your Absinthe setup has any custom middleware or additional parameters that affect the WebSocket communication, you’d need to incorporate them here. Always consult the documentation or the server-side configuration for any specifics that might affect client-server communication.

Key points to consider:

  1. Phoenix Message Structure: Phoenix expects messages to be in a specific structure with topic, event, payload, and ref.
  2. Joining the Channel: Before sending any data, you should join the appropriate Phoenix channel. The response from this will tell you if you’ve successfully joined.
  3. Heartbeat Handling: Phoenix channels have a heartbeat mechanism with “phx_ping” and “phx_pong” events. This keeps the connection alive. When the server sends a “phx_ping”, you should reply with a “phx_pong” to acknowledge.
  4. GraphQL Subscription: Once you’ve joined the channel, you can send your GraphQL subscription. Responses from the server related to your subscription will be received in the while True loop.
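Points 1 and 3 above can be captured in small pure helpers (topic and event names are the ones used in the script earlier; these helpers are illustrative, not part of any library):

```python
import json

def phoenix_message(topic, event, payload=None, ref="1"):
    """Build the four-field envelope (topic, event, payload, ref) that
    Phoenix channels expect for every message."""
    return {"topic": topic, "event": event, "payload": payload or {}, "ref": ref}

def heartbeat_reply(ping):
    """Answer a phx_ping with a phx_pong echoing the same ref, so the
    server keeps the connection alive."""
    return phoenix_message("phoenix", "phx_pong", ref=ping["ref"])

# Joining the Absinthe control topic before sending any subscription
join = phoenix_message("__absinthe__:control", "phx_join")
frame = json.dumps(join)  # what actually goes over the wire
```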

Make sure to handle exceptions, adapt the topic names according to your Phoenix server configuration, and manage unique reference numbers (ref) for distinct requests.

Hints:

  • You can run several subscriptions at the same time
  • Always close unused WebSocket connections to free up resources
  • Use connection heartbeat mechanisms to ensure the link stays alive
  • Ensure your GraphQL subscriptions are optimized, avoiding n+1 query problems

HTTP

GraphQL is used over the HTTP protocol, so it is important to have basic HTTP knowledge.

Here is a link to the HTTP documentation

Sometimes, when making GraphQL requests, an error may appear that is not necessarily related to the request itself but to the HTTP server. That is why it is also essential to understand HTTP error codes.

Here is a list of the different status codes
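As an illustration, a tiny helper (status ranges per the HTTP specification; the function name and categories are our own) that maps a status code to a likely cause when a GraphQL call fails at the HTTP layer:

```python
def triage_status(code):
    """Map an HTTP status code to a rough cause for GraphQL debugging."""
    if 200 <= code < 300:
        return "ok"       # request reached GraphQL; check the body for errors
    if code in (401, 403):
        return "auth"     # token missing, expired, or insufficient scope
    if 400 <= code < 500:
        return "client"   # malformed request, wrong endpoint, etc.
    if 500 <= code < 600:
        return "server"   # HTTP server problem, not your query
    return "unknown"
```

Note that a 200 response can still carry GraphQL-level errors in its body, which is why the HTTP layer and the GraphQL layer must be checked separately.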

Tools

Based on your environment, you can open the online GraphQL editor:

Most queries require an authenticated user. You should provide your auth token in the Authorization section.