...
```json
{
  "sql-credentials": [
    {
      "host": "sql-server-abc",
      "username": "my_user",
      "password": "my_password"
    },
    {
      "host": "sql-other-server-def",
      "username": { "from_file": "/config/the-username" },
      "password": { "from_env": "THE_ENV_VAR_WHERE_MY_PASSWORD_IS" }
    }
  ],
  "s3-credentials": {
    "storedCredentials": [
      {
        "path": "",
        "authenticationType": "BASIC",
        "accessKeyId": "my_key",
        "secretAccessKey": "my_secret"
      }
    ],
    "eksServiceAccountCredentials": [
      {
        "path": "s3.us-east-2.amazonaws.com",
        "roleArnEnvVariable": "TEST_ROLE_ARN",
        "webIdentityTokenFileEnvVariable": "TEST_TOKEN_FILE",
        "sessionPrefix": "testprefixABC"
      }
    ]
  },
  "azure-credentials": {
    "storedCredentials": [
      {
        "account": "my_account",
        "authenticationType": "SHARED_KEY",
        "path": "",
        "sharedKey": "mysharedkey=="
      }
    ]
  }
}
```
The ‘sql-credentials’ section associates username and password credentials with specific SQL database hosts. Credentials can be hardcoded directly in the config file, read from a file (via ‘from_file’), or read from an environment variable (via ‘from_env’).
...
1. Open the ModelOp Runtime to which you want to add endpoints.
...
2. Select the “Create Endpoint” button for either the input or output endpoints
...
3. Click the button on the right to add either an Input endpoint or an Output endpoint. Choose either REST or Kafka:
...
REST Endpoints
Add a Name, optional Description, Encoding type, and optional port, then select “Save Endpoint”.
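As a rough illustration, the saved definition boils down to just these four fields. The key names below mirror the form labels above, while the JSON shape and the example values are assumptions for illustration, not ModelOp Center's actual stored format:

```json
{
  "name": "MyRestEndpoint1",
  "description": "REST endpoint for consumer transactions",
  "encoding": "json",
  "port": 8080
}
```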
...
Kafka endpoints require several fields, as detailed in the screenshot and table below.
...
Field | Description | Example |
---|---|---|
Name | A user-supplied name for the endpoint | “MyKafkaEndpoint1” |
Description | A user-supplied description for the endpoint | “Kafka Endpoint for consumer transactions” |
Encoding | The encoding used for serialization/deserialization of the Kafka messages | json |
BootstrapServers | The name and port of the Kafka bootstrap servers | "kafka1:9002" |
Group (Consumer Group) (optional) | To provide a horizontally scalable solution, a Consumer Group can be used to load-balance requests across multiple ModelOp runtimes. By defining a Consumer Group, ModelOp Center ensures that only one ModelOp runtime in the Consumer Group services a given message. | "fastscore-1" |
Principal (optional) | An authenticated user in a secure cluster | "kafka/kafka@REALM" |
Topic | The Kafka topic to consume from or publish to, depending on whether the endpoint is an input or output endpoint | "MyKafkaTopic" |
Keyfile (optional) | The location of the keytab file containing pairs of Kerberos principals and encrypted keys | "/fastscore.keytab" |
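Putting the fields together, a fully specified Kafka endpoint might look like the sketch below. The key names and values come from the Field and Example columns of the table, but the JSON shape itself is an illustrative assumption rather than the exact format ModelOp Center uses:

```json
{
  "name": "MyKafkaEndpoint1",
  "description": "Kafka Endpoint for consumer transactions",
  "encoding": "json",
  "bootstrapServers": "kafka1:9002",
  "group": "fastscore-1",
  "principal": "kafka/kafka@REALM",
  "topic": "MyKafkaTopic",
  "keyfile": "/fastscore.keytab"
}
```

Group, Principal, and Keyfile are optional: Principal and Keyfile apply only to secured (Kerberos) clusters, and Group is needed only when load-balancing a topic across multiple ModelOp runtimes.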
...