Creating a Custom Datasink for Store and Forward
HistoricalData and HistoryFlavor
HistoricalData is an important interface for defining all types of historical data that can be passed through Ignition's Store and Forward system. Before data passed into Ignition can enter Store and Forward, it must first be instantiated as a type of historical data, or it will not be recognized.
Navigate to the sub folder of your Gateway hook subproject and create a new class that implements HistoricalData. In this example, the folder path is com/inductiveautomation/ignition/…/…/gateway/dataset and the new class will be named CustomData:
package com.inductiveautomation.ignition.database.service.gateway.dataset;
import com.inductiveautomation.ignition.gateway.history.HistoricalData;
import com.inductiveautomation.ignition.gateway.history.HistoryFlavor;
import java.util.List;
import java.util.ArrayList;
public class CustomData implements HistoricalData {
public static final CustomFlavor FLAVOR = new CustomFlavor();
private final List<List<Object>> rows = new ArrayList<>();
public CustomData() {}
public void addRow(List<Object> row) {
rows.add(row);
}
public List<List<Object>> getRows() {
return rows;
}
@Override
public HistoryFlavor getFlavor() {
return FLAVOR;
}
@Override
public String getSignature() {
// Used by Store and Forward to group like data together; a simple constant is sufficient here
return "CustomData";
}
@Override
public int getDataCount() {
// Report the number of stored rows so Store and Forward can track this data accurately
return rows.size();
}
@Override
public String getLoggerName() {
// Name used when Store and Forward logs messages about this data
return "CustomData";
}
public static class CustomFlavor extends HistoryFlavor {
public CustomFlavor() {
super("custom");
}
}
}
The first class defined above, CustomData, provides the minimum requirements for instantiating a valid historical data structure. When reviewing this code, the most important takeaways are the following:
Extending HistoryFlavor
On the very first line of the CustomData class body, you will notice an instantiated declaration of the class CustomFlavor, whose definition is the static nested class at the end of CustomData's body. Similar to CustomData, CustomFlavor is an extension of Ignition's HistoryFlavor class, and it is also required for defining a historical data type that Store and Forward will accept.
public class CustomData implements HistoricalData {
public static final CustomFlavor FLAVOR = new CustomFlavor();
...
...
...
...
public static class CustomFlavor extends HistoryFlavor {
public CustomFlavor() {
super("custom");
}
}
}
Declaring a rows Field
The second line of the CustomData class defines a field for storing the historical data itself. Notice that it is a two-dimensional List of Object type. This allows CustomData to hold datasets containing multiple datatypes per row, such as int, float, string, and datetime values. When it comes time to insert a passed-in dataset, addRow(List<Object>) is used to add a valid row of data to an instance of CustomData.
private final List<List<Object>> rows = new ArrayList<>();
...
...
public void addRow(List<Object> row) {
rows.add(row);
}
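As a quick illustration (the values below are purely hypothetical), a caller could populate a CustomData instance one row at a time, keeping the same column order that the sink defined later in this guide will expect: uuid, name, currency, and t_stamp as epoch milliseconds.
CustomData data = new CustomData();
// Column order must match the INSERT statement used by the sink: uuid, name, currency, t_stamp
data.addRow(List.of(1001, "Sample Item", 19.99f, System.currentTimeMillis()));
data.addRow(List.of(1002, "Another Item", 4.50f, System.currentTimeMillis()));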
Inheriting AbstractDatasourceSink
AbstractDatasourceSink is the most essential piece of the Ignition API that you will need for modules using Store and Forward. In order to pass your own unique sets of data through Store and Forward, you first need to understand this sub-system's use of datasinks. Datasinks accept historical data and pass it on to database storage or to a remote Gateway. Each datasink accepts its own specific type of data; for example, Tag History and Alarm Journal data each have their own datasink. This allows each type of historical data to pass properly through Store and Forward before it is inserted into a database.
In this section, you will expand further on this implementation to create a simple datasink in the Gateway hook scope.
Navigate to the sub folder of your Gateway hook subproject and create a new class that extends AbstractDatasourceSink. In this example, the folder path is com/inductiveautomation/ignition/…/…/gateway and the new class will be named CustomSink:
package com.inductiveautomation.ignition.database.service.gateway;
import com.inductiveautomation.ignition.common.db.schema.ColumnProperty;
import com.inductiveautomation.ignition.common.sqltags.model.types.DataType;
import com.inductiveautomation.ignition.common.util.LogUtil;
import com.inductiveautomation.ignition.common.util.LoggerEx;
import com.inductiveautomation.ignition.database.service.gateway.dataset.CustomData;
import com.inductiveautomation.ignition.gateway.datasource.SRConnection;
import com.inductiveautomation.ignition.gateway.db.DBTranslator;
import com.inductiveautomation.ignition.gateway.db.schema.DBTableSchema;
import com.inductiveautomation.ignition.gateway.history.HistoricalData;
import com.inductiveautomation.ignition.gateway.history.HistoryFlavor;
import com.inductiveautomation.ignition.gateway.history.sf.sinks.AbstractDatasourceSink;
import com.inductiveautomation.ignition.gateway.model.GatewayContext;
import com.inductiveautomation.ignition.gateway.util.DBUtilities;
import com.inductiveautomation.ignition.common.TypeUtilities;
import org.apache.log4j.Level;
import java.sql.PreparedStatement;
import java.util.EnumSet;
import java.util.List;
public class CustomSink extends AbstractDatasourceSink {
private final LoggerEx log = LogUtil.getLogger(getClass().getSimpleName());
protected CustomSink(GatewayContext context, String dataSource) {
super(context, dataSource);
}
@Override
public void startup() {
super.startup();
}
@Override
protected void initialize() throws Exception {
super.initialize();
// verify that the table exists and that its columns are correct
DBTableSchema schema;
DBTranslator translator = getDatasource().getTranslator();
log.info("Initializing CustomSink: Verifying Schema for CustomData...");
schema = new DBTableSchema("CustomData", translator);
schema.addRequiredColumn("id", DataType.Int4, EnumSet.of(ColumnProperty.AutoIncrement, ColumnProperty.PrimaryKey));
schema.addRequiredColumn("uuid", DataType.Int8, null);
schema.addRequiredColumn("name", DataType.String, null);
schema.addRequiredColumn("currency", DataType.Float8, null);
schema.addRequiredColumn("t_stamp", DataType.DateTime, null);
SRConnection conn = null;
try {
conn = getDatasource().getConnection();
schema.verifyAndUpdate(conn);
log.info("Initializing CustomSink: CustomData Verified!");
} finally {
DBUtilities.close(conn);
}
}
@Override
public void shutdown() {
super.shutdown();
// cleanup any resources
}
@Override
protected String getDescriptionKey() {
return "CustomSink.Description";
}
@Override
protected void storeDataToDatasource(SRConnection conn, HistoricalData data) throws Exception {
if (data == null) {
return;
}
if (data instanceof CustomData) {
storeCustomData(conn, (CustomData) data);
} else {
LogUtil.logOncePerMinute(getLogger(), Level.WARN, String.format(
"Unknown data type received by custom sink, will not be stored: %s", data.getClass()));
}
}
protected void storeCustomData(SRConnection conn, CustomData data) {
// Insert the accepted rows into the CustomData table
log.info("storeCustomData: Preparing Batch Statement for CustomData...");
// create a prepared statement
String query = "INSERT INTO CustomData(uuid, name, currency, "
+ "t_stamp) VALUES(?,?,?,?)";
// and iterate through your custom data
// by calling List<List<Object>> getRows():
// prepare the statement, add each row to the batch, then execute the batch and commit
try(PreparedStatement pstmt = conn.prepareStatement(query)){
//Set auto-commit to false
conn.setAutoCommit(false);
log.info("storeCustomData: Iterating CustomData for Batch...");
// Set the variables
List<List<Object>> rows = data.getRows();
for(List<Object> row: rows) {
pstmt.setInt( 1, TypeUtilities.toInteger(row.get(0)) ); // Use TypeUtilities.toInteger() instead of (int)
pstmt.setString( 2, (String) row.get(1) );
pstmt.setFloat( 3, TypeUtilities.toFloat(row.get(2)) ); // Use TypeUtilities.toFloat() instead of (float)
pstmt.setTimestamp( 4, new java.sql.Timestamp(TypeUtilities.toLong(row.get(3))) ); // Will use values of unixtime long in milliseconds
// Add it to the batch
pstmt.addBatch();
}
log.info("storeCustomData: Executing/Committing Batch...");
// Create an int[] to hold returned values
int[] count = pstmt.executeBatch();
// Explicitly commit statements to apply changes
conn.commit();
} catch (Exception e) {
throw new RuntimeException(e);
}
}
@Override
public boolean acceptsData(HistoryFlavor dataType) {
return dataType != null
&& (dataType.isCompatible(CustomData.FLAVOR));
}
@Override
public boolean isLicensedFor(HistoryFlavor dataType) {
return acceptsData(dataType); // For this example, bypass a dedicated license class by returning the acceptsData() result
}
}
The code snippet above overrides only the required methods of AbstractDatasourceSink. When reviewing the implementation of each method, the following details are worth noting:
Overriding initialize()
In the initialize() method, the sink verifies that a database table named CustomData exists with specific columns and datatypes:
id (int, auto-increment primary key)
uuid (long)
name (string)
currency (float)
t_stamp (datetime, populated from unix time in milliseconds)
If the table does not exist, it will be created when the module starts up.
@Override
protected void initialize() throws Exception {
super.initialize();
// verify that the table exists and that its columns are correct
DBTableSchema schema;
DBTranslator translator = getDatasource().getTranslator();
log.info("Initializing CustomSink: Verifying Schema for CustomData...");
schema = new DBTableSchema("CustomData", translator);
schema.addRequiredColumn("id", DataType.Int4, EnumSet.of(ColumnProperty.AutoIncrement, ColumnProperty.PrimaryKey));
schema.addRequiredColumn("uuid", DataType.Int8, null);
schema.addRequiredColumn("name", DataType.String, null);
schema.addRequiredColumn("currency", DataType.Float8, null);
schema.addRequiredColumn("t_stamp", DataType.DateTime, null);
SRConnection conn = null;
try {
conn = getDatasource().getConnection();
schema.verifyAndUpdate(conn);
log.info("Initializing CustomSink: CustomData Verified!");
} finally {
DBUtilities.close(conn);
}
}
Overriding storeDataToDatasource()
In the storeDataToDatasource() method, incoming data is checked to confirm it is an instance of the type this sink is meant to handle. In this example, the condition if (data instanceof CustomData) compares the incoming HistoricalData against the object type discussed in the previous section, CustomData. Because CustomData implements the HistoricalData interface, this condition is able to recognize valid instances.
The storeCustomData method called inside the body of the if (data instanceof CustomData) condition is a helper that builds and executes a batched PreparedStatement for any amount of data accepted by the custom datasink.
@Override
protected void storeDataToDatasource(SRConnection conn, HistoricalData data) throws Exception {
if (data == null) {
return;
}
if (data instanceof CustomData) {
storeCustomData(conn, (CustomData) data);
} else {
LogUtil.logOncePerMinute(getLogger(), Level.WARN, String.format(
"Unknown data type received by custom sink, will not be stored: %s", data.getClass()));
}
}
protected void storeCustomData(SRConnection conn, CustomData data) {
// Insert the accepted rows into the CustomData table
log.info("storeCustomData: Preparing Batch Statement for CustomData...");
// create a prepared statement
String query = "INSERT INTO CustomData(uuid, name, currency, "
+ "t_stamp) VALUES(?,?,?,?)";
// and iterate through your custom data
// by calling List<List<Object>> getRows():
// prepare the statement, add each row to the batch, then execute the batch and commit
try(PreparedStatement pstmt = conn.prepareStatement(query)){
//Set auto-commit to false
conn.setAutoCommit(false);
log.info("storeCustomData: Iterating CustomData for Batch...");
// Set the variables
List<List<Object>> rows = data.getRows();
for(List<Object> row: rows) {
pstmt.setInt( 1, TypeUtilities.toInteger(row.get(0)) ); // Use TypeUtilities.toInteger() instead of (int)
pstmt.setString( 2, (String) row.get(1) );
pstmt.setFloat( 3, TypeUtilities.toFloat(row.get(2)) ); // Use TypeUtilities.toFloat() instead of (float)
pstmt.setTimestamp( 4, new java.sql.Timestamp(TypeUtilities.toLong(row.get(3))) ); // Will use values of unixtime long in milliseconds
// Add it to the batch
pstmt.addBatch();
}
log.info("storeCustomData: Executing/Committing Batch...");
// Create an int[] to hold returned values
int[] count = pstmt.executeBatch();
// Explicitly commit statements to apply changes
conn.commit();
} catch (Exception e) {
throw new RuntimeException(e);
}
}
Inside the try block for the PreparedStatement, notice how each row of the custom data is converted with different methods. For some database column datatypes, it is necessary to use Ignition's TypeUtilities class for the conversion instead of a plain Java cast. In this case, that applies to setInt() (uuid column), setFloat() (currency column), and setTimestamp() (t_stamp column).
for(List<Object> row: rows) {
pstmt.setInt( 1, TypeUtilities.toInteger(row.get(0)) ); // Use TypeUtilities.toInteger() instead of (int)
pstmt.setString( 2, (String) row.get(1) );
pstmt.setFloat( 3, TypeUtilities.toFloat(row.get(2)) ); // Use TypeUtilities.toFloat() instead of (float)
pstmt.setTimestamp( 4, new java.sql.Timestamp(TypeUtilities.toLong(row.get(3))) ); // Will use values of unixtime long in milliseconds
// Add it to the batch
pstmt.addBatch();
}
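To illustrate why the coercion helpers matter, here is a minimal sketch (using the same TypeUtilities class already imported by CustomSink; the literal "42" simply stands in for whatever a row happens to contain). Values arrive through HistoricalData as plain Objects, and in the CSV example later in this guide they are actually Strings, so a plain Java cast would fail at runtime:
Object raw = "42"; // a row value as it arrives through HistoricalData
// int viaCast = (int) raw; // would throw ClassCastException at runtime, since raw holds a String
int viaUtility = TypeUtilities.toInteger(raw); // coerces the value instead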
Overriding acceptsData() and isLicensedFor()
A few additional checks are required in the methods acceptsData() and isLicensedFor(). In acceptsData(), another required condition checks the incoming flavor against the FLAVOR property of the CustomData type defined in an earlier section.
isLicensedFor() would normally perform a more involved set of checks using a user-defined "License" class (for example, comparing the HistoryFlavor datatype against properties of a custom class containing licensing information). For this example, you can bypass implementing and referencing a "License" class by returning the same result as acceptsData().
@Override
public boolean acceptsData(HistoryFlavor dataType) {
return dataType != null
&& (dataType.isCompatible(CustomData.FLAVOR));
}
@Override
public boolean isLicensedFor(HistoryFlavor dataType) {
return acceptsData(dataType); // For this example, bypass a dedicated license class by returning the acceptsData() result
}
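If you do want a real licensing check, one possible shape (sketched below with a purely hypothetical CustomLicense helper that is not part of the Ignition API) is to keep the flavor check from acceptsData() and additionally consult whatever licensing state your module tracks:
// Hypothetical helper owned by the module; a real implementation might be driven by the module's LicenseState.
public class CustomLicense {
    private final boolean historicalStorageAllowed;
    public CustomLicense(boolean historicalStorageAllowed) {
        this.historicalStorageAllowed = historicalStorageAllowed;
    }
    public boolean allows(HistoryFlavor flavor) {
        // Only permit flavors this sink understands, and only when the license grants historical storage
        return historicalStorageAllowed && flavor.isCompatible(CustomData.FLAVOR);
    }
}
// Inside CustomSink, assuming a CustomLicense field named license:
@Override
public boolean isLicensedFor(HistoryFlavor dataType) {
    return dataType != null && license.allows(dataType);
}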
Extending AbstractGatewayModuleHook
Once all of the custom classes are defined, the last step is to instantiate them in the Gateway hook scope. In this example, installing the module with the current implementation will run the methods shown below immediately on startup.
Your Gateway hook class should already be available to edit at the same scope as the other class definitions. In this example, the Gateway hook class is named DatabaseServiceTestGatewayHook:
package com.inductiveautomation.ignition.database.service.gateway;
import java.util.List;
import java.util.Optional;
import javax.servlet.http.HttpServletResponse;
import com.inductiveautomation.ignition.common.licensing.LicenseState;
import com.inductiveautomation.ignition.common.project.resource.adapter.ResourceTypeAdapter;
import com.inductiveautomation.ignition.common.project.resource.adapter.ResourceTypeAdapterRegistry;
import com.inductiveautomation.ignition.common.script.ScriptManager;
import com.inductiveautomation.ignition.common.script.hints.PropertiesFileDocProvider;
import com.inductiveautomation.ignition.common.util.LogUtil;
import com.inductiveautomation.ignition.common.util.LoggerEx;
import com.inductiveautomation.ignition.database.service.gateway.dataset.CustomData;
import com.inductiveautomation.ignition.database.service.gateway.scripting.DatabaseScriptModule;
import com.inductiveautomation.ignition.gateway.clientcomm.ClientReqSession;
import com.inductiveautomation.ignition.gateway.dataroutes.RouteGroup;
import com.inductiveautomation.ignition.gateway.model.AbstractGatewayModuleHook;
import com.inductiveautomation.ignition.gateway.model.GatewayContext;
import com.inductiveautomation.ignition.gateway.model.GatewayModuleHook;
import com.inductiveautomation.ignition.gateway.web.models.ConfigCategory;
import com.inductiveautomation.ignition.gateway.web.models.IConfigTab;
import com.inductiveautomation.ignition.gateway.web.pages.config.overviewmeta.ConfigOverviewContributor;
import com.inductiveautomation.ignition.gateway.web.pages.status.overviewmeta.OverviewContributor;
/**
* Class which is instantiated by the Ignition platform when the module is loaded in the gateway scope.
*/
public class DatabaseServiceTestGatewayHook extends AbstractGatewayModuleHook {
private GatewayContext context;
private CustomSink sink;
private final LoggerEx log = LogUtil.getLogger(getClass().getSimpleName());
private DatabaseScriptModule scriptModule;
/**
* Called before startup. This is the chance for the module to add its extension points and update persistent
* records and schemas. None of the managers will be started up at this point, but the extension point managers will
* accept extension point types.
*/
@Override
public void setup(GatewayContext context) {
this.context = context;
log.debug("Setup");
}
/**
* Called to initialize the module. Will only be called once. Persistence interface is available, but only in
* read-only mode.
*/
@Override
public void startup(LicenseState activationState) {
// Custom datasink startup implementation
log.info("Startup: Custom Datasink");
sink = new CustomSink(context, "MSSQL");
context.getHistoryManager().registerHistoryFlavor(CustomData.FLAVOR);
context.getHistoryManager().registerSink(sink);
}
/**
* Called to shutdown this module. Note that this instance will never be started back up - a new one will be created
* if a restart is desired
*/
@Override
public void shutdown() {
// Custom datasink shutdown implementation
context.getHistoryManager().unregisterHistoryFlavor(CustomData.FLAVOR);
context.getHistoryManager().unregisterSink(sink, false);
}
/**
* A list (may be null or empty) of panels to display in the config section. Note that any config panels that are
* part of a category that doesn't exist already or isn't included in {@link #getConfigCategories()} will
* <i>not be shown</i>.
*/
@Override
public List<? extends IConfigTab> getConfigPanels() {
return null;
}
/**
* A list (may be null or empty) of custom config categories needed by any panels returned by {@link
* #getConfigPanels()}
*/
@Override
public List<ConfigCategory> getConfigCategories() {
return null;
}
/**
* @return the path to a folder in one of the module's gateway jar files that should be mounted at
* /res/module-id/foldername
*/
@Override
public Optional<String> getMountedResourceFolder() {
return Optional.empty();
}
/**
* Provides a chance for the module to mount any route handlers it wants. These will be active at
* <tt>/main/data/module-id/*</tt> See {@link RouteGroup} for details. Will be called after startup().
*/
@Override
public void mountRouteHandlers(RouteGroup routes) {
}
/**
* Used by the mounting underneath /res/module-id/* and /main/data/module-id/* as an alternate mounting path instead
* of your module id, if present.
*/
@Override
public Optional<String> getMountPathAlias() {
return Optional.empty();
}
/**
* @return {@code true} if this is a "free" module, i.e. it does not participate in the licensing system. This is
* equivalent to the now defunct FreeModule attribute that could be specified in module.xml.
*/
@Override
public boolean isFreeModule() {
return false;
}
/**
* Implement this method to contribute meta data to the Status section's Systems / Overview page.
*/
@Override
public Optional<OverviewContributor> getStatusOverviewContributor() {
return Optional.empty();
}
/**
* Implement this method to contribute meta data to the Configure section's Overview page.
*/
@Override
public Optional<ConfigOverviewContributor> getConfigOverviewContributor() {
return Optional.empty();
}
/**
* Register any {@link ResourceTypeAdapter}s this module needs with {@code registry}.
* <p>
* ResourceTypeAdapters are used to adapt a legacy (7.9 or prior) resource type name or payload into a nicer format
* for the Ignition 8.0 project resource system. Only override this method for modules that aren't known by the
* {@link ResourceTypeAdapterRegistry} already.
* <p>
* <b>This method is called before {@link #setup(GatewayContext)} or {@link #startup(LicenseState)}.</b>
*
* @param registry the shared {@link ResourceTypeAdapterRegistry} instance.
*/
@Override
public void initializeResourceTypeAdapterRegistry(ResourceTypeAdapterRegistry registry) {
}
/**
* Called prior to a 'mounted resource request' being fulfilled by requests to the mounted resource servlet serving
* resources from /res/module-id/ (or /res/alias/ if {@link GatewayModuleHook#getMountPathAlias} is implemented). It
* is called after the target resource has been successfully located.
*
* <p>
* Primarily intended as an opportunity to amend/alter the response's headers for purposes such as establishing
* Cache-Control. By default, Ignition sets no additional headers on a resource request.
* </p>
*
* @param resourcePath path to the resource being returned by the mounted resource request
* @param response the response to read/amend.
*/
@Override
public void onMountedResourceRequest(String resourcePath, HttpServletResponse response) {
}
}
Overriding startup() and shutdown()
The only overridden methods that need special attention are startup() and shutdown(), since they are responsible for instantiating the CustomSink and for registering (and later unregistering) both the sink and the custom flavor with the Gateway's history manager.
private GatewayContext context;
private CustomSink sink;
...
...
...
...
/**
* Called to initialize the module. Will only be called once. Persistence interface is available, but only in
* read-only mode.
*/
@Override
public void startup(LicenseState activationState) {
// Custom datasink startup implementation
log.info("Startup: Custom Datasink");
sink = new CustomSink(context, "MSSQL");
context.getHistoryManager().registerHistoryFlavor(CustomData.FLAVOR);
context.getHistoryManager().registerSink(sink);
}
/**
* Called to shutdown this module. Note that this instance will never be started back up - a new one will be created
* if a restart is desired
*/
@Override
public void shutdown() {
// Custom datasink shutdown implementation
context.getHistoryManager().unregisterHistoryFlavor(CustomData.FLAVOR);
context.getHistoryManager().unregisterSink(sink, false);
}
In the startup() method, note that the instantiated CustomSink is assigned to a field declared at the class level of the Gateway hook. The call that instantiates CustomSink also hardcodes the datasource name, the string MSSQL, as its second parameter. In this example, an existing, valid Gateway database connection named MSSQL is expected on the Gateway where this module is installed.
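If hardcoding the connection name is undesirable, the name could be resolved at startup instead. The sketch below reads it from a JVM system property (the property name customsink.datasource is purely illustrative; any configuration source such as a persistent record would work) and falls back to MSSQL:
// Hypothetical lookup; replace with whatever configuration mechanism your module already uses
String dataSourceName = System.getProperty("customsink.datasource", "MSSQL");
sink = new CustomSink(context, dataSourceName);
context.getHistoryManager().registerHistoryFlavor(CustomData.FLAVOR);
context.getHistoryManager().registerSink(sink);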
Optional: Inserting Mock CSV Data for Testing
At this point, your module should contain all of the fundamental backend Java classes for integrating with Ignition's Store and Forward system. In this bonus section, you will be provided with a simple Gateway-scoped scripting function that passes compatible CSV data to the database table named CustomData. The following code closely follows the steps outlined in Ignition's SDK Programmer's Guide for creating a simple scripting module.
In order to create the basic CSV import scripting function, you will need to create some classes, interfaces, and methods in both the Common and Gateway scopes:
For the Common scope:
CsvUpload (a public interface): Within the DatabaseServiceTestModule, implement CsvUpload.
For the Gateway scope:
DatabaseScriptModule (a public class extending DatabaseServiceTestModule): Within the DatabaseServiceTestGatewayHook, initialize DatabaseScriptModule and override the appropriate methods.
Creating CsvUpload
The first step requires a new interface file, which will contain the entry point of the scripting function. The code below is fairly straightforward, since other classes will be responsible for implementing the details.
Navigate to the sub folder of your Common hook subproject and create a new public interface. In this example, the folder path is com/inductiveautomation/ignition/…/…/common and the new interface will be named CsvUpload:
package com.inductiveautomation.ignition.database.service.common;
public interface CsvUpload {
int uploadMockCSV(String filePath);
}
Implement CsvUpload within DatabaseServiceTestModule
Once the CsvUpload interface is defined within the Common subproject, you can wire it into the Common scope's main module class, which should already be available. In this example, the class is named DatabaseServiceTestModule:
package com.inductiveautomation.ignition.database.service.common;
import com.inductiveautomation.ignition.common.BundleUtil;
import com.inductiveautomation.ignition.common.script.hints.ScriptArg;
import com.inductiveautomation.ignition.common.script.hints.ScriptFunction;
public abstract class DatabaseServiceTestModule implements CsvUpload {
public static final String MODULE_ID = "com.inductiveautomation.ignition.database.service.DatabaseServiceTest";
static {
BundleUtil.get().addBundle(
DatabaseServiceTestModule.class.getSimpleName(),
DatabaseServiceTestModule.class.getClassLoader(),
DatabaseServiceTestModule.class.getName().replace('.', '/')
);
}
@ScriptFunction(docBundlePrefix = "DatabaseServiceTestModule")
public int uploadMockCSV(
@ScriptArg("filePath") String filePath) {
return uploadMockCSVImpl(filePath);
}
protected abstract int uploadMockCSVImpl(String filePath);
}
Extend DatabaseServiceTestModule
At this point, your module has everything it needs inside the Common scope. The next step requires a new class that extends the main class of the Common scope in order to finally define the functionality of the CSV import method.
Navigate to the sub folder of your Gateway hook subproject and create a new public class within its own scripting folder. In this example, the folder path is com/inductiveautomation/ignition/…/…/gateway and the new class will be created at scripting/DatabaseScriptModule:
package com.inductiveautomation.ignition.database.service.gateway.scripting;
import com.inductiveautomation.ignition.common.util.LogUtil;
import com.inductiveautomation.ignition.common.util.LoggerEx;
import com.inductiveautomation.ignition.database.service.common.DatabaseServiceTestModule;
import com.inductiveautomation.ignition.database.service.gateway.dataset.CustomData;
import com.inductiveautomation.ignition.gateway.model.GatewayContext;
import java.io.File;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;
public class DatabaseScriptModule extends DatabaseServiceTestModule {
private final GatewayContext context;
private final String dataSource;
private final LoggerEx log = LogUtil.getLogger(getClass().getSimpleName());
public DatabaseScriptModule(GatewayContext context, String dataSource) {
super();
this.context = context;
this.dataSource = dataSource;
}
@Override
protected int uploadMockCSVImpl(String filePath) {
List<List<Object>> csvData = readFile(filePath);
if(csvData == null){
return -1;
}
log.info("uploadMockCSV: CSV Import Successful!");
try {
log.info("uploadMockCSV: Attempting CustomData Insert to " + this.dataSource);
CustomData customData = new CustomData();
for(List<Object> row : csvData) {
customData.addRow(row);
}
// Hand the data to Store and Forward; the registered CustomSink will receive and store it
context.getHistoryManager().storeHistory(this.dataSource, customData);
log.info("uploadMockCSV: Insert Executed!");
return 1;
} catch (Exception e) {
log.error("uploadMockCSV: Insert Failed!", e);
return -1;
}
}
protected List<List<Object>> readFile(String filePath) {
List<List<Object>> csvData = new ArrayList<>();
log.info("uploadMockCSV: Reading CSV Import...");
try {
File file = new File(filePath);
if (file.exists()) {
List<String> data = Files.readAllLines(Paths.get(filePath));
for(String line : data) {
String[] lineArray = line.split(",");
csvData.add(List.of(
lineArray[0], // CustomData.uuid
lineArray[1], // CustomData.name
lineArray[2], // CustomData.currency
lineArray[3] // CustomData.t_stamp
));
}
} else {
log.error("File: " + filePath + " not found");
return null;
}
} catch (Exception e) {
log.error("Failed to read file: " + filePath, e);
return null;
}
return csvData;
}
}
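For reference, readFile() expects every line of the file to contain four comma-separated values in the order uuid, name, currency, t_stamp (epoch milliseconds), with no header row. A hypothetical mock file might look like the following; note that the simple split(",") above does not handle quoted fields, so values themselves must not contain commas:
1001,Sample Item,19.99,1672531200000
1002,Another Item,4.50,1672534800000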
Returning to the Extended AbstractGatewayModuleHook
The final steps will require you to return to your main class for the Gateway hook scope. Here you will need to add declarations and overrides in a few places:
package com.inductiveautomation.ignition.database.service.gateway;
...
...
import com.inductiveautomation.ignition.common.script.ScriptManager;
import com.inductiveautomation.ignition.common.script.hints.PropertiesFileDocProvider;
...
...
import com.inductiveautomation.ignition.gateway.clientcomm.ClientReqSession;
...
...
/**
* Class which is instantiated by the Ignition platform when the module is loaded in the gateway scope.
*/
public class DatabaseServiceTestGatewayHook extends AbstractGatewayModuleHook {
...
...
private DatabaseScriptModule scriptModule;
...
...
/**
* Called to initialize the module. Will only be called once. Persistence interface is available, but only in
* read-only mode.
*/
@Override
public void startup(LicenseState activationState) {
...
...
scriptModule = new DatabaseScriptModule(context,"MSSQL");
}
...
...
@Override
public void initializeScriptManager(ScriptManager manager) {
super.initializeScriptManager(manager);
manager.addScriptModule(
"system.customdata",
scriptModule,
new PropertiesFileDocProvider());
}
@Override
public Object getRPCHandler(ClientReqSession session, String projectName) {
return scriptModule;
}
...
...
...
...
}
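Once the module is built and installed on a Gateway that has a database connection named MSSQL, the function registered above should be callable from any Gateway-scoped script as system.customdata.uploadMockCSV("<path to csv>"). It returns 1 when the rows were successfully handed off to Store and Forward and -1 when the file could not be read or the insert failed.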