Deploying Azure SQL Edge

Gibin Francis
Jun 22, 2021

Microsoft has released an edge-optimized version of SQL Server called Azure SQL Edge. Let's see how to deploy and configure it for an IoT application.

Prerequisites

You need Azure portal access with an IoT Hub and an IoT Edge device. You can follow the link below to set up the IoT Edge device:
https://docs.microsoft.com/en-us/azure/iot-edge/how-to-install-iot-edge?view=iotedge-2020-11

Creating Module Deployment

In this example we will deploy Azure SQL Edge Developer from the marketplace, along with Microsoft's temperature simulator to generate some sample values.

You can search the marketplace for "Azure SQL Edge Developer" and update the password in the configuration before deploying it.

Alternatively, you can use the deployment manifest below.

Azure SQL Edge Developer
The module definition is included in the full manifest at the end of this section; you need to change the password.

Temperature Simulator
We can use Microsoft's simulated temperature sensor module to generate sample events; it is also included in the full manifest below.

Routing
Once both modules are ready, we need to add one more thing to the deployment manifest: routing. We should pipe messages from the temperature sensor into the Azure SQL Edge module, like below:
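"routes": {
  "sqledgeroute": "FROM /messages/modules/tempsimulator/outputs/temperatureOutput INTO BrokeredEndpoint(\"/modules/AzureSQLEdge/inputs/dbinput\")"
}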

Here we use the name "dbinput" as the input for Azure SQL Edge. Any name will do, but note it down; we will need the same name when connecting it to SQL Server in a later step.

Please find the full manifest below.

{"modulesContent": {"$edgeAgent": {"properties.desired": {"modules": {"AzureSQLEdge": {"settings": {"image": "mcr.microsoft.com/azure-sql-edge/developer","createOptions": "{\"HostConfig\":{\"CapAdd\":[\"SYS_PTRACE\"],\"Binds\":[\"sqlvolume:/sqlvolume\"],\"PortBindings\":{\"1433/tcp\":[{\"HostPort\":\"1433\"}]},\"Mounts\":[{\"Type\":\"volume\",\"Source\":\"sqlvolume\",\"Target\":\"/var/opt/mssql\"}]},\"Env\":[\"MSSQL_AGENT_ENABLED=TRUE\",\"ClientTransportType=AMQP_TCP_Only\",\"PlanId=asde-developer-on-iot-edge\"]}"},"type": "docker","version": "1.0","env": {"ACCEPT_EULA": {"value": "Y"},"MSSQL_SA_PASSWORD": {"value": "passworD12345"},"MSSQL_LCID": {"value": "1033"},"MSSQL_COLLATION": {"value": "SQL_Latin1_General_CP1_CI_AS"}},"status": "running","restartPolicy": "always"},"tempsimulator": {"settings": {"image": "mcr.microsoft.com/azureiotedge-simulated-temperature-sensor:1.0","createOptions": ""},"type": "docker","version": "1.0","status": "running","restartPolicy": "always"}},"runtime": {"settings": {"minDockerVersion": "v1.25"},"type": "docker"},"schemaVersion": "1.0","systemModules": {"edgeAgent": {"settings": {"image": "mcr.microsoft.com/azureiotedge-agent:1.1","createOptions": "{\"HostConfig\":{\"Binds\":[\"/etc/edgedata/edgeagent:/cache\"],\"LogConfig\":{\"Type\":\"json-file\",\"Config\":{\"max-size\":\"10m\",\"max-file\":\"3\"}}}}"},"type": "docker"},"edgeHub": {"settings": {"image": "mcr.microsoft.com/azureiotedge-hub:1.1","createOptions": "{\"HostConfig\":{\"Binds\":[\"/etc/edgedata/edgehub:/cache\"],\"PortBindings\":{\"443/tcp\":[{\"HostPort\":\"443\"}],\"5671/tcp\":[{\"HostPort\":\"5671\"}],\"8883/tcp\":[{\"HostPort\":\"8883\"}]}}}"},"type": "docker","status": "running","restartPolicy": "always"}}}},"$edgeHub": {"properties.desired": {"routes": {"sqledgeroute": "FROM /messages/modules/tempsimulator/outputs/temperatureOutput INTO BrokeredEndpoint(\"/modules/AzureSQLEdge/inputs/dbinput\")"},"schemaVersion": "1.1","storeAndForwardConfiguration": {"timeToLiveSecs": 1296000}}}}}

Now the edge device is running both the Temperature Simulator and Azure SQL Edge. To test the SQL Edge deployment, we can connect to it using the commands below.

Use this command to get a shell inside the Azure SQL Edge container:

sudo docker exec -it AzureSQLEdge bash

Now that we have a bash shell inside the Azure SQL Edge container, connect with sqlcmd:

/opt/mssql-tools/bin/sqlcmd -S localhost -U SA -P "<your_password>"

You are now at the sqlcmd prompt and can try SQL commands, for example:

SELECT Name from sys.Databases
GO

Once you have verified connectivity to SQL Edge, you are ready to set up the streaming job that will receive the messages.

Now let's go through the steps to create the stream and the required settings in SQL Edge. You can use Azure Data Studio to log into the server and run the commands below.

Create Database

CREATE DATABASE TestDB
ON
(NAME = TestDB, FILENAME = '/var/opt/mssql/testdb.mdf')
GO

Create Table

CREATE TABLE TestDB.dbo.SensorReadings (
timeCreated VARCHAR(255) NULL,
temperature FLOAT NULL,
pressure FLOAT NULL
)
GO
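As an optional sanity check, you can insert a test row and read it back before wiring up the streaming job (the values below are made up):

-- Optional: verify the table accepts rows before streaming starts
INSERT INTO TestDB.dbo.SensorReadings (timeCreated, temperature, pressure)
VALUES ('2021-06-22T10:00:00Z', 21.5, 1.01)
SELECT * FROM TestDB.dbo.SensorReadings
GO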

Create Master Key
We need this master key in place before we can create the database scoped credential in a later step.

CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'your_strong_password';

Create JSON Format
This creates a custom JSON external file format on the SQL server. For more configuration options, see the CREATE EXTERNAL FILE FORMAT documentation.

If NOT Exists (select name from sys.external_file_formats where name = 'JSONFormat')
Begin
CREATE EXTERNAL FILE FORMAT [JSONFormat]
WITH ( FORMAT_TYPE = JSON)
End

You can also specify options such as compression when creating the external file format.
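For instance, a compressed JSON format could look like the sketch below; the name JSONGzipFormat is illustrative, and the codec string follows the DATA_COMPRESSION option documented for external file formats:

-- Sketch: JSON external file format with Gzip compression (illustrative)
CREATE EXTERNAL FILE FORMAT [JSONGzipFormat]
WITH (
FORMAT_TYPE = JSON,
DATA_COMPRESSION = 'org.apache.hadoop.io.compress.GzipCodec'
)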

Create external data source with EdgeHub
Now we create an external data source of type edgehub. For more configuration options, see the CREATE EXTERNAL DATA SOURCE documentation.

If NOT Exists (select name from sys.external_data_sources where name = 'EdgeHub')
Begin
Create EXTERNAL DATA SOURCE [EdgeHub]
With(
LOCATION = N'edgehub://'
)
End

Create External Stream
Now we create an external stream with EdgeHub as the data source and JSONFormat as the file format. Here LOCATION is the input name we specified in the routing section of the deployment manifest.

If NOT Exists (select name from sys.external_streams where name = 'sqldbinput')
Begin
CREATE EXTERNAL STREAM sqldbinput WITH
(
DATA_SOURCE = EdgeHub,
FILE_FORMAT = JSONFormat,
LOCATION = N'dbinput'
)
End

Create Scoped Credential
Now we need to create a database scoped credential, which the streaming job will use to connect to the table in our database. Replace the password with your SQL Edge sa password.

Declare @SQL_SA_Password varchar(200) = 'passworD12345'
declare @query varchar(max)
If NOT Exists (select name from sys.database_scoped_credentials where name = 'SQLCredential')
Begin
set @query = 'CREATE DATABASE SCOPED CREDENTIAL SQLCredential
WITH IDENTITY = ''sa'', SECRET = ''' + @SQL_SA_Password + ''''
Execute(@query)
End

Create external data source with SQL
Now we create a second data source that points back at the local SQL Edge instance (the "." in the location means the local server, on port 1433), using the scoped credential from the previous step.

If NOT Exists (select name from sys.external_data_sources where name = 'LocalSQLOutput')
Begin
CREATE EXTERNAL DATA SOURCE LocalSQLOutput WITH (
LOCATION = 'sqlserver://tcp:.,1433', CREDENTIAL = SQLCredential)
End

Create external stream with SQL
Now we can create an external stream on top of the SQL data source created above; its LOCATION points at the target table.

If NOT Exists (select name from sys.external_streams where name = 'sqldboutput')
Begin
CREATE EXTERNAL STREAM sqldboutput WITH
(
DATA_SOURCE = LocalSQLOutput,
LOCATION = N'TestDB.dbo.SensorReadings'
)
End

Create Streaming Job
Now let's create the streaming job. It runs the same engine as Azure Stream Analytics, so the same query patterns apply, with the exception of UDFs and external reference tables.

EXEC sys.sp_create_streaming_job @name=N'sqlasaprocess',
@statement= N'Select i.timeCreated AS [timeCreated], i.machine.temperature AS [temperature], i.machine.pressure AS [pressure] INTO sqldboutput from sqldbinput i'

Here the aliases map the incoming JSON fields to the column names of the SensorReadings table.
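The dotted paths work because the simulated sensor sends a nested JSON body. As a sketch of what else the Stream Analytics engine allows, here is a windowed variant; the message shape shown in the comments is assumed, and the job name avgprocess and the 30-second window are illustrative:

-- Assumed (abridged) shape of each simulator message:
-- { "timeCreated": "...", "machine": { "temperature": 21.3, "pressure": 1.01 }, ... }
-- Windowed aggregates from the ASA query language also work; the aliases
-- are chosen to line up with the SensorReadings columns.
EXEC sys.sp_create_streaming_job @name=N'avgprocess',
@statement= N'Select System.Timestamp() AS [timeCreated], AVG(i.machine.temperature) AS [temperature], AVG(i.machine.pressure) AS [pressure] INTO sqldboutput from sqldbinput i GROUP BY TumblingWindow(second, 30)'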

Start streaming job
Now we are ready to start the streaming job. Once we run the command, the job starts.

exec sys.sp_start_streaming_job @name=N'sqlasaprocess'

Now you can see the data arriving in your SQL Edge database.
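For example, a quick check from sqlcmd or Azure Data Studio:

-- Most recent readings written by the streaming job
SELECT TOP 10 timeCreated, temperature, pressure
FROM TestDB.dbo.SensorReadings
ORDER BY timeCreated DESC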

Clean Up SQL Resources

EXEC sys.sp_stop_streaming_job @name=N'sqlasaprocess'
EXEC sys.sp_drop_streaming_job @name=N'sqlasaprocess'
DROP EXTERNAL STREAM sqldbinput
DROP EXTERNAL STREAM sqldboutput
DROP EXTERNAL DATA SOURCE LocalSQLOutput
DROP EXTERNAL DATA SOURCE EdgeHub
DROP TABLE TestDB.dbo.SensorReadings
DROP DATABASE SCOPED CREDENTIAL SQLCredential
DROP MASTER KEY

Some useful tables and queries

Get streaming job status

exec sys.sp_get_streaming_job @name=N'sqlasaprocess'
WITH RESULT SETS
(
(
name nvarchar(256),
status nvarchar(256),
error nvarchar(256)
)
)

Get Active job details

Select
sj.Name as Job_Name,
sj.Create_date as Job_Create_date,
sj.modify_date as Job_Modify_date,
sj.statement as Stream_Job_Query,
Input_Stream_Name =
Case js.is_input
when 1 then s.Name
else null
END,
output_Stream_Name =
case js.is_output
when 1 then s.Name
else null
END,
s.location as Stream_Location
from sys.external_job_streams js
inner join sys.external_streams s on s.object_id = js.stream_id
inner join sys.external_streaming_jobs sj on sj.object_id = js.job_id
Summary of external streams by data source type

select
count(*) as [count], sum(inputs) as inputs, sum(outputs) as outputs, sum(linked_to_job)
as linked_to_job, data_source_type
from (
select isnull(value,'unknown') as data_source_type, inputs, outputs, linked_to_job
from
(
select
convert(sysname, lower(substring(ds.location, 0, charindex('://', ds.location))), 1) as data_source_type,
isnull(inputs, 0) as inputs, isnull(outputs, 0) as outputs, isnull(js.stream_id/js.stream_id, 0) as linked_to_job
from sys.external_streams es
join sys.external_data_sources ds
on es.data_source_id = ds.data_source_id
left join
(
select stream_id, max(cast(is_input as int)) inputs, max(cast(is_output as int)) outputs
from sys.external_job_streams group by stream_id
) js
on js.stream_id = es.object_id
) ds
left join
(
select value from string_split('edgehub,sqlserver,kafka', ',')) as known_ep on data_source_type = value
) known_ds
group by data_source_type

Browse the streaming catalog views

select * from sys.external_streaming_jobs
select * from sys.external_job_streams
select * from sys.external_streams

Hope this helps with your deployments. Thanks!
