stormlibpp.httpcore module

Implements some methods from synapse.cortex.Cortex but with HTTP.

This is useful when replicating Telepath based tools that need to be used with HTTP. For example, the hstorm CLI replaces a Cortex object with an HttpCortex so it can run Storm commands over HTTP.

class stormlibpp.httpcore.HttpCortex(url: str = 'https://localhost:4443', usr: str = '', pwd: str = '', token: str = '', default_opts: dict = {'repr': True}, ssl_verify: bool = True)

Bases: object

A class with some methods from synapse.cortex.Cortex but over HTTP.

For now, it only supports the storm and callStorm methods. These methods take the same arguments and return the same types of values as their Cortex equivalents.

Communicating with Synapse over HTTP requires a user on the Cortex that has a password set. HttpCortex needs to authenticate to the Cortex before making requests (i.e., before using any of this object's methods). Because of this, HttpCortex implements a login method, and the constructor expects a username and password.

HttpCortex relies on an aiohttp.ClientSession underneath to make HTTP requests. This session needs to be closed to avoid errors at the end of program execution. HttpCortex exposes a close method that must be called when this object is no longer needed.

HttpCortex is an async context manager. It calls the login and close methods for you on entry and exit.

Examples:

# Use HttpCortex as an async context manager so login/cleanup is handled
import pprint

async with HttpCortex("<HTTP URL>", "<username>", "<password>") as hcore:
    async for msg in hcore.storm("[inet:ipv4=1.1.1.1]"):
        if msg[0] == "node":
            pprint.pprint(msg[1])

# Or handle login and object cleanup yourself
hcore = HttpCortex("<HTTP URL>", "<username>", "<password>")
await hcore.login()
async for msg in hcore.storm("[inet:ipv4=1.1.1.1]"):
    print(msg)
    # Do other things with each "msg"
await hcore.close()

# callStorm can be used to get a single value instead of streaming results
async with HttpCortex(...) as hcore:
    retn = await hcore.callStorm("$var = 'some val' return($var)")
    if retn["status"]:
        print(retn["result"])

Parameters

url : str, optional

The URL of the Synapse Cortex to connect to, by default “https://localhost:4443”.

usr : str, optional

The username to authenticate with, by default “”.

pwd : str, optional

The password to authenticate with, by default “”.

token : str, optional

A token to authenticate with, instead of usr/pwd, by default “”.

default_opts : dict, optional

The default Storm options to pass with every request made by this instance. Set this to an empty dict to disable. By default {“repr”: True}.

ssl_verify : bool, optional

Whether to verify the Cortex’s SSL certificate, by default True.
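As a sketch of the constructor arguments, the settings below show token-based authentication; every value is a placeholder, not a real endpoint or key, and ssl_verify=False is only appropriate for lab setups with self-signed certificates.

```python
# Placeholder connection settings for HttpCortex; all values are examples.
conn = dict(
    url="https://cortex.example.com:4443",
    token="<api key>",    # when set, usr/pwd are ignored at login
    default_opts={},      # an empty dict disables the default {"repr": True}
    ssl_verify=False,     # e.g. for self-signed certs in a lab
)
# hcore = HttpCortex(**conn)
```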

async addFeedData(name: str | None, items: list[tuple[tuple[str, str], stormlibpp.node.NodeVals]], *, viewiden: str | None = None)

Feed node tuples to the Cortex.

Parameters

name : str | None

An optional name to give the import.

items : list[NodeTuple]

A list of NodeTuples that will be imported into the Cortex.

viewiden : str | None, optional

A specific view to import data into; the default view is used if None. By default None.

Returns

dict

The return from the /api/v1/feed endpoint.

Raises

HttpCortexError

If an exception is raised when making an HTTP request to the Cortex. This will likely either be from an HTTP error, a connection error, or an error decoding the JSON response.
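A minimal sketch of the items argument, assuming Synapse's packed-node layout of ((form, valu), info), where info may carry props and tags; the import name "my-import" below is a hypothetical label.

```python
# Packed-node tuples for addFeedData, assuming the ((form, valu), info)
# layout used by Synapse; the "props"/"tags" keys in info are optional.
items = [
    (("inet:ipv4", "1.1.1.1"), {"tags": {"example": (None, None)}}),
    (("inet:fqdn", "example.com"), {"props": {}}),
]

# Inside an async context (hcore is a logged-in HttpCortex):
#     retn = await hcore.addFeedData("my-import", items)
```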

async callStorm(text: str, opts: dict | None = None)

Execute a Storm query and return the value passed to a Storm return() call.

Parameters

text : str

The Storm code to execute.

opts : dict | None, optional

Storm options to use when executing this Storm code, by default None.

Returns

dict

The response from the Cortex’s HTTP API. It contains 2 keys:

result
status

Raises

HttpCortexError

If an exception is raised when making an HTTP request to the Cortex. This will likely either be from an HTTP error, a connection error, or an error decoding the JSON response.
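As a sketch of the opts argument, the dict below scopes a call to a specific view and passes a Storm variable; the view iden is a placeholder, and "view"/"vars" are standard Storm option keys.

```python
# Standard Storm option keys; the view iden is a placeholder.
opts = {
    "view": "<view iden>",
    "vars": {"fqdn": "example.com"},
}

# Inside an async context (hcore is a logged-in HttpCortex):
#     retn = await hcore.callStorm("return($fqdn)", opts=opts)
#     if retn["status"]:
#         print(retn["result"])
```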

async exportStorm(text: str, opts: dict | None = None) → AsyncGenerator[bytes, None]

Export packed nodes returned by a given query using /api/v1/storm/export.

Parameters

text : str

A Storm query that returns nodes to export.

opts : dict | None, optional

The Storm options to use for the export; see synapse.tools.storm.ExportCmd for some export-specific options. By default None.

Yields

bytes

The packed nodes returned by the query. Yielded in chunks for large sets of nodes.

Raises

HttpCortexError

If an exception is raised when making an HTTP request to the Cortex. This will likely either be from an HTTP error, a connection error, or an error reading the response.
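The chunked byte stream is typically written straight to a .nodes file. The sketch below shows that pattern; FakeCortex is a stand-in for HttpCortex so the snippet runs standalone, and only the async-generator-of-bytes interface is assumed.

```python
import asyncio

# FakeCortex stands in for HttpCortex here; only the interface of
# exportStorm (an async generator yielding bytes chunks) is assumed.
class FakeCortex:
    async def exportStorm(self, text, opts=None):
        for chunk in (b"\x92\x01", b"\x02"):
            yield chunk

async def save_export(core, query, path):
    # Write each streamed chunk straight to a .nodes file.
    with open(path, "wb") as fd:
        async for chunk in core.exportStorm(query):
            fd.write(chunk)

asyncio.run(save_export(FakeCortex(), "inet:ipv4", "export.nodes"))
```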

async getAxonBytes(*args, **kwargs)

Not implemented - here to support an HTTP Storm CLI.

async getAxonUpload(*args, **kwargs)

Not implemented - here to support an HTTP Storm CLI.

async login(close=False)

Login to the Cortex with the user/pass (or API key) supplied at instantiation.

If a user/pass is supplied, cookie based authentication is used.

Sets the cookie returned by the Cortex in the underlying ClientSession. Ignores the expiration date because there were errors adding the SimpleCookie to the session's CookieJar; instead, the raw cookie value is used, without the options set by the server.

Cortex cookies expire after two weeks, so this object shouldn't live longer than that without calling this method again.

If an API key is passed at instantiation, the user/pass combo is ignored and the ClientSession is configured to send the API key value in the X-API-KEY header with every request.

Parameters

close : bool

Close the underlying session when logins fail, by default False.

async stop()

Stop this instance by closing its HTTP session.

async storm(text: str, opts: dict | None = None, tuplify: bool = True) → AsyncGenerator[tuple[str, dict], None]

Evaluate a Storm query and yield the streamed Storm messages.

Parameters

text : str

The Storm code to execute.

opts : dict | None, optional

Storm options to use when executing this Storm code, by default None.

tuplify : bool, optional

Whether to pass streamed Storm messages to the synapse.lib.msgpack.deepcopy function for conversion to the "packed tuple" format that most Cortex methods use. This incurs a slight performance hit, but it plays nicely with existing Synapse code, most importantly the CLI. Setting this option to False removes the performance concern, but it may break other tooling that expects "packed tuple" input values. synapse.common.tuplify was another candidate over deepcopy, but deepcopy is faster. By default True.

Yields

StormMsg

Each message streamed by the Cortex.

Raises

HttpCortexError

If an exception is raised when making an HTTP request to the Cortex. This will likely either be from an HTTP error, a connection error, or an error decoding the JSON response.
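The yielded tuples can be sorted by their message type. The helper below is a hypothetical sketch of that pattern; the sample messages only approximate the shapes of real Cortex output.

```python
# Hypothetical helper splitting streamed Storm messages by type; the
# ("node", ...) / ("err", ...) tuple shapes follow Synapse conventions.
def split_msgs(msgs):
    nodes, errs = [], []
    for mtype, info in msgs:
        if mtype == "node":
            nodes.append(info)
        elif mtype == "err":
            errs.append(info)
    return nodes, errs

# Approximate sample of one streamed run (shapes only, not real output).
msgs = [
    ("init", {"tick": 0}),
    ("node", (("inet:ipv4", 16843009), {"props": {}})),
    ("fini", {"count": 1}),
]
nodes, errs = split_msgs(msgs)
```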

async stormlist(text: str, opts: dict | None = None, tuplify: bool = True) → list[tuple[str, dict]]

Evaluate a Storm query and return the streamed Storm messages as a list.

stormlibpp.httpcore.StormMsg

A message yielded by a Cortex storm call.

See Storm Message Types.

alias of tuple[str, dict]

stormlibpp.httpcore.StormMsgType

The type of a Storm message.

See Storm Message Types.