LUA1: Writing item values to CSV files

The integrated LUA engine also provides the ability to read and write files. This short LUA tutorial demonstrates how item values can be written to a CSV file at a defined interval.

To use this tutorial, please complete the following steps in advance:

  • Install the NETx BMS Server. The setup can be found on our website.

  • Create a new BMS Server workspace and integrate some data points that you want to store in a CSV file.

A minimal approach for storing item values

Creating a LUA function to store data to CSV files

First, we will create a new LUA function that writes the current value and the current time to a CSV file. Within the toolbar of the NETx BMS Studio, click on the "Edit Script" button.

Afterwards, you can select the LUA file that you want to edit. Select "nxaDefinitions.lua" to open the main LUA file of the workspace.

LUA code can be split into different files to organize it. "nxaDefinitions.lua" is the main file, which is loaded by the NETx BMS Server automatically. All other LUA files must be imported by using the "require" statement at the beginning of "nxaDefinitions.lua".
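For example, a helper file myHelpers.lua placed in the ScriptFiles folder (the file name is hypothetical) could be imported like this:

myHelpers = require "myHelpers"

The ready-to-use solution described later in this tutorial uses exactly this mechanism.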

Then start to implement a new LUA function called LogToCsv. You can implement it at any location outside the other functions. Paste in the following code:

function LogToCsv(item, filename)
  local value = nxa.GetValue(item)
  local timeStamp = nxa.DateToString(nxa.Now())
  local fp = assert(io.open(filename, "a"))
  fp:write(item .. ";" .. timeStamp .. ";" .. value .. "\n")
  fp:close()
end

The function takes two parameters as input:

  • item: this parameter specifies the ItemID of the data point that shall be monitored.

  • filename: this parameter specifies the name of the CSV file.

The function gets the current value of the specified Server Item and stores it, together with the ItemID and the current time, in the defined file. A semicolon (;) is used as the delimiter. After finishing the implementation, press the "Save" and "Reload" buttons within the LUA editor in order to reload the newly defined LUA code.
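A resulting line in the CSV file could then look like this (the value and the exact timestamp format are illustrative; the latter depends on nxa.DateToString):

NETx\XIO\KNX\192.168.1.36\05/1/000;07.04.2025 12:45:16;21.5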

Testing the LUA code

To test the LUA code, select "Execute LUA script ..." from the "Tools" menu within the NETx BMS Server. A small dialog opens which can be used to invoke LUA functions directly during runtime. Enter LogToCsv within the dialog and provide an ItemID and a filename as parameters. In the following example, the KNX group address 5/1/0 of the KNXnet/IP router 192.168.1.36 is used as data point and "C:\Users\wolfgang.granzer\Desktop\test.csv" as CSV file.

Note that the character \, which is used as the delimiter within the ItemID, is a special character within LUA that needs to be escaped. The escape character is also the backslash \. Therefore, two backslashes have to be used within a string whenever one backslash is required: \\
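Entered in the dialog, the call from the example above would therefore look like this:

LogToCsv("NETx\\XIO\\KNX\\192.168.1.36\\05/1/000", "C:\\Users\\wolfgang.granzer\\Desktop\\test.csv")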

Execute the LUA code periodically

In order to execute the LogToCsv function periodically, you can use the predefined timer callbacks OnSecondTimerEvent(), OnMinuteTimerEvent() and OnHourTimerEvent(). If you want to invoke the function at an interval other than 1 second, 1 minute or 1 hour, you can use the modulo operator. The following example shows how the LogToCsv function is invoked every 15 minutes.

--==============================================================================
-- OnMinuteTimerEvent() - function is called every minute
--==============================================================================
function OnMinuteTimerEvent()
  if (nxa.Minute(nxa.Now()) % 15 == 0) then
    LogToCsv("NETx\\XIO\\KNX\\192.168.1.36\\05/1/000", "C:\\Users\\wolfgang.granzer\\Desktop\\test.csv")
  end
end

A ready-to-use solution that stores values of multiple items

This solution is easy to use and builds on the steps described above. Multiple items can be tracked to create and update CSV files. It even allows you to define different datapoint sets that are recorded into separate CSV files. Two steps are required: adding another file to the ScriptFiles folder of the workspace and setting up one or more configurations that determine which datapoints get tracked and how.

Limitations: Only number, real, and boolean values are safe to use. For items of type real, make sure the CSV separator is not . or ,, since otherwise the CSV file may get corrupted.
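For illustration, assume , is used as the separator and the server formats a real value with a decimal comma as 21,5 (this depends on the locale settings). In the resulting line, it is impossible to tell whether 21,5 is one value or two columns:

Time stamp,Item001
07.04.2025 12:45:16,21,5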

Steps

The required file is exportToCsv.lua (matching the require statement below). Put this file into the ScriptFiles folder of the workspace.

Now, the script file has to be included and a configuration has to be created. You can add both to the "nxaDefinitions.lua" file, which is executed by the server on startup or when the script engine gets restarted.

Include the file within "nxaDefinitions.lua" with the following statement:

exportToCsv = require "exportToCsv"

You can append this to the other require statements at the top of the file.

Now you can create an object with some configuration. You can put this code right below the require statement:

exporter = exportToCsv:new({
  csvSeparator = ";",
  csvFirstColumnOfHeaderValue = "Time stamp",
  valueForBadQuality = "???",
  valueForInvalidItem = "n/a",
  targetDirectory = "C:\\Users\\some.user\\Desktop\\csv",
  operationUnits = {
    {["start"] = "00:00:00", ["end"] = "12:00:00", ["fileName"] = "values-{DATE}-BeforeNoon.csv"},
    {["start"] = "12:00:00", ["end"] = "00:00:00", ["fileName"] = "values-{DATE}-AfterNoon.csv"},
  },
  trackedItems = {
    {["name"] = "FirstTrackedItem", ["item"] = "NETx\\VAR\\Real\\Item001"},
    {["name"] = "SecondTrackedItem", ["item"] = "NETx\\VAR\\Boolean\\Item001"},
    {["name"] = "ThirdTrackedItem", ["item"] = "NETx\\VAR\\String\\Item001",
     ["modifier"] = function(value)
       if value == nil then
         return nil
       else
         return ">>> " .. value
       end
     end},
    {["name"] = "FourthTrackedItem", ["item"] = "NETx\\VAR\\Date\\Item001"},
  }
})

Declaring the object and the configuration alone does nothing yet. A CSV file will only be created or updated when the execute function gets called. You can add this to the OnMinuteTimerEvent() function:

function OnMinuteTimerEvent()
  exporter:execute()
end

Calling the function execute will create a file. It will include a header line containing the column names, and a second line with the timestamp and the current values of the tracked items. If the file already exists, another line is simply appended to it, with the current timestamp and the current item values.
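With the example configuration above, the file content could look like this (the timestamp and value formatting are illustrative and depend on the server settings):

Time stamp;FirstTrackedItem;SecondTrackedItem;ThirdTrackedItem;FourthTrackedItem
07.04.2025 12:45:00;21.5;true;>>> Hello;07.04.2025 12:00:00
07.04.2025 12:46:00;21.4;true;>>> Hello;07.04.2025 12:00:00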

If the configuration does not get changed, the data columns will match the defined columns, so that you get a valid CSV file. If you need to change the tracked datapoints, keep in mind that the data columns of an already created CSV file may no longer match - the function does not (and cannot) check whether the column names and the number of columns are still valid.

Configuration

The following configuration options are available:

  • csvSeparator: The character used to separate the CSV values.

  • csvFirstColumnOfHeaderValue: Some arbitrary text which gets added as the first column name (which is the timestamp) in the header line. Can also be "" if no text is required.

  • valueForBadQuality: A text that is used if a datapoint has no valid value (shown as ??? in the Item Tree of the Core Studio), which is the case if the item does not have good quality. Can also be "" if no text is required.

  • valueForInvalidItem: If you have specified a datapoint (ItemID) that does not exist and therefore has no value, this text will be used as the value. Non-existing datapoints do not cause an error; instead, this text indicates the misconfiguration.

  • targetDirectory: The directory where the CSV files will be created.

  • trackedItems: A table with arbitrary names and their corresponding datapoints, which shall be observed. The names are visible in the header line of the CSV file; the order that you use here determines the order of the columns in the CSV file. Each entry must have a name and an item element. A modifier element is optional and allows you to manipulate the value that was read from the item: define or provide a function with a single parameter that returns any value (define the function before its usage, i.e. in the lines above where you need it!). In the example above, the tracked string item prefixes each value with >>> . The modifier also allows you, for example, to define the number of decimal places of a number. Consider that the incoming value might be nil when you create such a function.
    Example of how to convert Real values to a number with two digits after the decimal point:

    function formatReal(value)
      if value == nil then
        return nil
      else
        return string.format("%.2f", value)
      end
    end

    -- Within configurations the function can be re-used
    ...
    trackedItems = {
      {["name"] = "Item001", ["item"] = "NETx\\VAR\\Real\\Item001", ["modifier"] = formatReal},
      {["name"] = "Item002", ["item"] = "NETx\\VAR\\Real\\Item002", ["modifier"] = formatReal},
    }
    ...
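    A modifier can also map values to readable text. The following sketch is hypothetical (formatBool is not part of the solution) and converts boolean values to ON/OFF:

    -- hypothetical modifier that maps boolean values to readable text
    function formatBool(value)
      if value == nil then
        return nil
      elseif value then
        return "ON"
      else
        return "OFF"
      end
    end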
  • operationUnits: This defines the file name of the CSV file and also the times when a CSV file can be created or updated. As you can see in the example above, you can define different times and file names. With that, you can create different files, e.g. for the first and the second half of a day.

    • The file name may contain the placeholder {DATE} which will be replaced by the date at the time of execution. Using this placeholder will create a new file every day.

    • You need to have at least one operation unit, e.g. like this: {["start"] = "00:00:00", ["end"] = "00:00:00", ["fileName"] = "values-{DATE}.csv"}. This will record item values at any time when execute gets called. The first call in a day will create a new file with the current date, so that you get a new CSV file each day, differing only in the date in the file name. Each unit requires a start, end, and fileName declaration.

    • When execute gets called, the function will search for the first operation unit where the current time falls between the defined start and end time (24h notation); subsequent units will be ignored. This determines the file name that gets used. If no unit is defined, or no unit has a time range that the current time falls into, nothing will happen. This way, you can turn off recording at certain times, as shown in the sketch below.
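    For example, the following configuration fragment (the times are chosen for illustration) records only during working hours; outside this range, calling execute does nothing:

    operationUnits = {
      {["start"] = "06:00:00", ["end"] = "18:00:00", ["fileName"] = "values-{DATE}.csv"},
    }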

Multiple configurations

You can have multiple configurations that work independently. This is an example:

exporter1 = exportToCsv:new({
  csvSeparator = ";",
  csvFirstColumnOfHeaderValue = "Time stamp",
  valueForBadQuality = "???",
  valueForInvalidItem = "n/a",
  targetDirectory = "C:\\Users\\some.user\\Desktop\\csv",
  operationUnits = {
    {["start"] = "06:00:00", ["end"] = "14:00:00", ["fileName"] = "working-shift-1-{DATE}.csv"},
    {["start"] = "14:00:00", ["end"] = "22:00:00", ["fileName"] = "working-shift-2-{DATE}.csv"},
  },
  trackedItems = {
    {["name"] = "FirstTrackedItem", ["item"] = "NETx\\VAR\\Real\\Item001"},
    {["name"] = "SecondTrackedItem", ["item"] = "NETx\\VAR\\Real\\Item002"},
    {["name"] = "ThirdTrackedItem", ["item"] = "NETx\\VAR\\Real\\Item003"},
  }
})

exporter2 = exportToCsv:new({
  csvSeparator = ";",
  csvFirstColumnOfHeaderValue = "",
  valueForBadQuality = "",
  valueForInvalidItem = "",
  targetDirectory = "C:\\Users\\some.user\\Documents",
  operationUnits = {
    {["start"] = "00:00:00", ["end"] = "00:00:00", ["fileName"] = "values-{DATE}.csv"},
  },
  trackedItems = {
    {["name"] = "Temparatur", ["item"] = "NETx\\VAR\\Real\\Item004"}
  }
})

These are two configurations that can be triggered independently. As noted before, this is just the configuration; nothing will happen yet. You can now call execute, e.g. within the function OnMinuteTimerEvent(), but on different schedules:

function OnMinuteTimerEvent()
  if (nxa.Minute(nxa.Now()) % 10 == 0) then
    exporter1:execute()
  end
  exporter2:execute()
end

This will track the items defined in exporter1 every ten minutes, and the items of exporter2 every minute. Note that this is only one way to trigger the execute function. You can also trigger it with XCommand event definitions or Task definitions, which might be necessary if you would like to activate the export, e.g., only on certain days of the week (the script can only be configured for certain times, but not for days).
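Alternatively, a day-of-week check can be done directly in LUA. The following is a minimal sketch, assuming the integrated engine exposes the standard LUA os library (if it does not, use XCommands or Tasks as mentioned above):

function OnMinuteTimerEvent()
  -- os.date("%w") returns "0" for Sunday up to "6" for Saturday (standard LUA)
  local weekday = tonumber(os.date("%w"))
  -- record only on working days (Monday to Friday)
  if weekday >= 1 and weekday <= 5 then
    exporter1:execute()
  end
end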

For testing purposes, you can also trigger it with "Execute LUA script ..." from the "Tools" menu in the Core Studio.

Note: Do not forget to save the script files when you have edited them and to restart the script engine as described above!