Sunday 4 September 2016

ArcoFlex Releases IoT Prototyping Unit


In a world first, Arcoflex Automation Solutions releases the only true end-to-end solution for piloting and prototyping IoT remote monitoring and control solutions. Sign up and receive data in minutes. There are so many moving parts to an IoT remote monitoring solution that it can be quite bewildering just to get started. What Arcoflex has done is give you a fully functioning, universal monitoring solution in a box. One that will work straight away.
 
Be fully operational and monitoring data in just 10 minutes!


This is a truly unique offering, born of the frustration of not having coding examples or the relevant infrastructure experience. We wanted to give companies a way to start playing with remote monitoring NOW, and not after months of frustration and fruitless experimentation. Built on Microsoft’s Azure application environment and its expansive IoT Suite, everything is put together for you to use right away.
Q. How easy is it to use?
Very easy. Here is all that you need to get started:
· Some sensors (or use the samples provided)
· A local area network connection or a 3G/4G dongle
· A web browser to review your results
· 240V and patience

We provide detailed instructions for what is going on behind the scenes and how to connect sensors. Failing that, we have a 24/7 help desk ready to get you on your feet. The developer kit is provided in a portable carry case, but the breadboard can be taken out and used directly if required.
Q. Why does it work, and why is it so easy?

Good question! It works because we’ve written all the code for everything. EVERYTHING. From the I/O integration on the PI, through the web jobs connecting IoT hubs and event hubs, all the way to data storage and visualisations. You just need to configure your prototype and you’re running. In fact, we’ve even pre-loaded a sample so that you can see it operational the moment you turn it on.
Q. How do I prototype with it?
This is the easy bit. We provide you with two modes of operation:
Mode 1 – Hardware Prototyping
There are three coded I/O boards in the standard configuration, but we can customise a box for other implementations if required. Just ask. You will be given wiring instructions for digital, analogue and one-wire sensors. Please consult the product specification for details, but you can connect up to 100 devices. All standard industry devices are supported. You can plug and play in just minutes and configure your data collection faster than the coffee pot will boil. We have provided some common sensors to get you started. Our staff can help with finding the addresses of one-wire devices or the input range of analogue devices. The digital I/O card provides a power supply for switches or CT sensors. Output I/O cards (for control purposes) can be supplied on request.
Mode 2 – Software Prototyping

We have provided sensor templates for around 30 common sensors, but you can create more for yourself. When placed in demo mode, a configured sensor will send pattern data, whether or not a physical sensor is attached. The purpose of this is to allow the UI, BI and analytics teams to evolve their software whilst waiting for the hardware guys to get into gear. So the displays, alerts and dashboards can be prototyped independently of the hardware. This is a unique capability not available anywhere else. Build your end-user applications in parallel with the hardware. In fact, your configuration parameters will help the hardware teams refine their sensors.
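To make the demo mode concrete, here is a minimal sketch of how a sensor template might generate pattern data when no hardware is attached. The class and property names are illustrative only and are not the actual Sensei template format:

```csharp
using System;

// Illustrative only: a sensor template that emits synthetic pattern data
// when no physical sensor is attached (hypothetical names, not the Sensei API).
public class DemoSensorTemplate
{
    public string Name { get; set; } = "CoolRoomTemp";
    public double Min { get; set; } = 1.5;      // expected lower bound
    public double Max { get; set; } = 4.0;      // expected upper bound
    public int PeriodSeconds { get; set; } = 600;

    // Generate a smooth sine-wave reading between Min and Max so that
    // dashboards and alert rules can be exercised without hardware.
    public double NextReading(DateTime now)
    {
        double phase = (now.TimeOfDay.TotalSeconds % PeriodSeconds) / PeriodSeconds;
        double midpoint = (Min + Max) / 2.0;
        double amplitude = (Max - Min) / 2.0;
        return Math.Round(midpoint + amplitude * Math.Sin(2 * Math.PI * phase), 2);
    }
}
```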

 
Q. What does it cost?
The developer kit costs a mere $950 plus shipping and handling. It comes with a one-month trial subscription for proof of concept purposes. Thereafter there is a monthly cost of $30 per device for an unlimited number of sensors. We offer 3, 6 and 12 month subscriptions, paid in advance. The prototyping edition has a limit of 20,000 messages (up to 100 bytes) per day but we can provide higher throughput for a small additional fee. Subscriptions can be suspended and restarted without penalty.
Q. How does it work?
What we have done is write the code required to handle every aspect of the solution. Everything is completely data driven and totally dynamic. Once you have evolved your own prototypes you will be able to replace our components or perhaps engage us to provide them. We will probably be able to do this at a vastly lower cost than you attempting to build the expertise internally. The architecture involved is outlined below.
The application environment is divided into six zones:
Zone 1 – Raspberry PI3B with Windows 10 IoT Core
We have written a .NET app for the Pi which is integrated with our supplied I/O cards. This is the greatest point of complexity for prototyping: managing the connectivity of the device with the operating system and then working out a collection and transmission protocol that works. We have done that for you, and the flexibility we’ve created is extensive. Basically, any sensor can be connected to our cards and all that is needed is some configuration metrics: signal range, physical address and other necessary parameters. All this is detailed very simply in the accompanying instructions.
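As an illustration of the kind of configuration metrics involved, a per-sensor record might look something like the sketch below. The field names are hypothetical, not the actual Sensei configuration schema:

```csharp
// Hypothetical shape of the per-sensor configuration described above;
// the real field names in the Sensei configurator may differ.
public class SensorConfig
{
    public string SensorId { get; set; }          // logical name, e.g. "Hothouse2.Temp"
    public string BusType { get; set; }           // "OneWire", "Analogue" or "Digital"
    public string PhysicalAddress { get; set; }   // e.g. one-wire ROM address or ADC channel
    public double RangeMin { get; set; }          // engineering value at the bottom of the signal range
    public double RangeMax { get; set; }          // engineering value at the top of the signal range
    public double Precision { get; set; }         // smallest change worth reporting
    public int SampleSeconds { get; set; }        // how often the Pi samples this sensor
    public bool DemoMode { get; set; }            // send pattern data instead of a real reading
}
```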
Zone 2 – Data Converter
We use Microsoft Azure IoT Hub to collect incoming data streams. Because our Raspberry Pi code uses a proprietary data compression protocol, the data converter web job converts our protocol back to JSON and inserts it into an Event Hub. This element is required, but from here any other Event Hub consumer can access the data: Hadoop, Stream Analytics, BI and so on.
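A minimal sketch of the forwarding half of such a converter job, assuming the Microsoft.Azure.EventHubs and Newtonsoft.Json packages, and leaving the proprietary decompression step (and how the web job receives messages from IoT Hub) out of scope:

```csharp
using System;
using System.Collections.Generic;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.EventHubs;   // assumes the Microsoft.Azure.EventHubs NuGet package
using Newtonsoft.Json;

// Sketch only: once the compressed payload has been expanded into plain
// name/value readings, serialise them to JSON and push them into the
// Event Hub for downstream consumers.
public class DataConverter
{
    private readonly EventHubClient _eventHub;

    public DataConverter(string eventHubConnectionString)
    {
        _eventHub = EventHubClient.CreateFromConnectionString(eventHubConnectionString);
    }

    public async Task ForwardAsync(string deviceId, IDictionary<string, double> readings)
    {
        var envelope = new { deviceId, readings, utc = DateTime.UtcNow };
        byte[] body = Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(envelope));
        await _eventHub.SendAsync(new EventData(body));
    }
}
```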
Zone 3 – Db Sync
The Db Sync job reads the Event Hub and writes to two SQL Azure databases. The first is effectively the current value for all sensors and devices, with the second being a history table. This is done for visualisation performance. The history table is used for data trending and other analytics purposes and will contain millions of rows.
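A simplified sketch of the two writes involved, using illustrative table and column names rather than the actual schema:

```csharp
using System;
using System.Data.SqlClient;   // assumes the classic System.Data.SqlClient provider

// Sketch of the two writes the Db Sync job performs for each reading:
// overwrite the current-value row (fast dashboards) and append to the
// history table (trending). Table and column names are illustrative.
public static class DbSync
{
    public static void WriteReading(string connStr, string sensorId, double value, DateTime utc)
    {
        using (var conn = new SqlConnection(connStr))
        {
            conn.Open();

            // 1. Upsert the "current value" row used by the dashboards.
            var current = new SqlCommand(
                @"MERGE SensorCurrent AS t
                  USING (SELECT @id AS SensorId) AS s ON t.SensorId = s.SensorId
                  WHEN MATCHED THEN UPDATE SET Value = @val, ReadAtUtc = @utc
                  WHEN NOT MATCHED THEN INSERT (SensorId, Value, ReadAtUtc) VALUES (@id, @val, @utc);",
                conn);
            current.Parameters.AddWithValue("@id", sensorId);
            current.Parameters.AddWithValue("@val", value);
            current.Parameters.AddWithValue("@utc", utc);
            current.ExecuteNonQuery();

            // 2. Append to the history table used for trending and analytics.
            var history = new SqlCommand(
                "INSERT INTO SensorHistory (SensorId, Value, ReadAtUtc) VALUES (@id, @val, @utc);", conn);
            history.Parameters.AddWithValue("@id", sensorId);
            history.Parameters.AddWithValue("@val", value);
            history.Parameters.AddWithValue("@utc", utc);
            history.ExecuteNonQuery();
        }
    }
}
```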
Zone 4 – Alert Manager
Set up as its own web job, the alert manager inspects all incoming data against the configured alert settings. SMS, email and simple logging options are available. There are rules for re-alerting so that you don’t end up spamming yourself.
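The re-alerting rule can be illustrated with a small sketch; the logic and names here are indicative only, not the production alert manager:

```csharp
using System;
using System.Collections.Generic;

// Sketch of the re-alerting rule: raise an alert when a reading breaches its
// limit, but suppress repeats for the same sensor until a cooldown has passed,
// so a sensor sitting just over its limit doesn't spam you every second.
public class AlertManager
{
    private readonly TimeSpan _cooldown;
    private readonly Dictionary<string, DateTime> _lastAlerted = new Dictionary<string, DateTime>();

    public AlertManager(TimeSpan cooldown) => _cooldown = cooldown;

    public bool ShouldAlert(string sensorId, double value, double limit, DateTime utcNow)
    {
        if (value <= limit) return false;                        // not in breach

        if (_lastAlerted.TryGetValue(sensorId, out var last) &&
            utcNow - last < _cooldown) return false;             // already alerted recently

        _lastAlerted[sensorId] = utcNow;
        return true;                                             // send SMS / email / log entry
    }
}
```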
Zone 5 – Data Gateway
The data gateway is simply a web service designed to read data from the Azure SQL DB. It was made a web service so that it can be scaled automatically, but also so that web apps and desktop apps have equal access to content. Think of this layer as a wrapper around the Azure DB.
Zone 6 – Configuration Application and Dashboard
We provide two applications to support this environment. The first and most important is the configurator. This application, a remote desktop services app, configures all the sensors on the device, their limits, alerts and trending details. You can configure your entire pilot environment with the configurator, whether or not you have real sensors in place yet. The second application is a dashboard and trending app so that you can visualise the data being collected. Even though this is a developer solution, all data is retained until you decide to remove it.
Q. What might the bandwidth charges be?
 
If you decide to use 3G/4G as a connectivity medium, then you need to understand bandwidth costs. Our transmission protocols are unique and innovative, but it is still easy to blow the budget if you are not careful. You could easily just go out and purchase a pay-as-you-go account and start the same day, but be careful to restrict the data transmission parameters with the configurator. If you are careful, this could cost you less than $15/month, but please speak to us if you have higher bandwidth needs. We can provide shared SIM card plans, probably a lot cheaper than you can arrange yourself.
Q. How do I integrate Control requirements?
This is a customisation add-on because we really need to know what you want to do. Control can be digital (on or off) or a variable value (like for a thermostat). It isn’t a big cost – in fact it is small – but the variability of what is possible requires us to discuss this with you first.
Q. How do I get started?
Jump on to the Arcoflex website and order a kit. Delivery is approximately 5 days from payment unless customisations are required. Replacement parts, additional sensors and other accessories are available on the website.
DON’T PROCRASTINATE, DON’T DELAY
GET ARCOFLEX SENSEI!
BUY ONE TODAY
 




Friday 2 September 2016

ArcoFlex Sensei is Different from the Rest


Remote monitoring solutions are beginning to make their presence felt. They can be really sophisticated and really expensive or they can be ArcoFlex solutions. To be fair, if you need to monitor and control complex industrial PLC based solutions, then full two-way VPN tunnels and specialised management solutions are probably necessary. But if your requirement is device and sensor monitoring, the remote and automated collection of data and some direct feedback control, then a sophisticated and expensive solution is not necessary.
 
ArcoFlex Sensei brings to you an affordable end point solution or prototyping unit that will get you right into the IoT remote monitoring and control space from Day One. Here’s how Sensei is different from the rest:
1.     High Performance.   On-chip processing reduces server-side workload for superior performance and data efficiency. By moving processing logic down to the chip we have ensured better scalability on the trending and data management side.
2.     Offline Data Collection.   Sensei has 16GB of on-board storage to hold up to 28 days of data in the case of lost communications or hardware failure, so there is no loss of trend data. Sensei has an auto burst mode to recover this data once communications have been restored.
3.     Virtual Sensors.   Sensei introduces the concept of virtual sensors which can aggregate other sensors (real and virtual) to produce combined logical or value outcomes.  Examples include: air locks (two door switches), pressure differentials (two pressure sensors), converting pump actuations to delivered volume and many more. The key is to put the conversion processing effort down on Sensei so that trending and visualisation retain universality and high efficiency.
4.     Application to I/O Coupling.   Our Sensei application code is intimately coupled to the I/O cards in such a way that simple electrical connections can be made for sensors and digital inputs. This perfectly matched coupling is what makes Sensei superior to work with, because we have taken out the complexity of the endless sensor types and how they might be configured electrically. Our I/O cards and configuration portal can be used immediately with almost any digital, analogue or 1-Wire device. RS-232 is available on request. All documentation is provided.
5.     Configuration Ready.   Sensei comes with a fully visual configuration environment to set up your sensors in just minutes. Test, play and control, and see your results in seconds.
6.     Output and Control.   Sensei is probably the only IOT remote monitoring solution to also offer practical feedback control. You can have two types: digital, for relay switching, or analogue, for rheostats and thermostats. This feedback can be configured on-chip for instant events, but it can also be configured to be reactive, setting values based on pre-set conditions.
7.     Visualisation Ready.   Default Trending and visualisation options are provided right out of the box.  Your Sensei device can be collecting data within seconds of connection. All that is required is a small amount of configuration to set value limits for display.
8.     Prototyping.   Sensei has a completely automated trial and prototyping mode which allows you to define and configure sensors without them being physically attached. This allows you to build UIs, data aggregation services, dashboards and BI tools in parallel with your hardware development. It also saves substantial costs by being able to replicate real-life scenarios.
9.     Alerting.   Alerts and logging are all managed with device side logic, making them genuinely capable of real time responsiveness. Our email latency is approximately 10 seconds on top of the default one second sensor latency. SMS latency is approximately 2 seconds.
10.  Real Time Data Flow.   Sub-second response times are routine for most of our installations. Enjoy the only IOT remote monitoring and control solution to provide this. To be fair, this can come at a bandwidth cost, but if genuine real time notifications are critical to your application, Sensei can provide them. Contact us to discuss the implications.
11.  GSM Enabled.   All Sensei devices are equipped with a GSM chip which reports its location back to the configuration portal and dashboard. This means we can map your devices to the screen and help identify sites for maintenance attention or alarms. “Hothouse 2” can be a mystery at 2.00am when the alarm goes off but when you can see and walk to the correct building with your tablet in your hand, life is a lot easier. Feed mills use this feature to set route planning for silo refilling.
12.  Competitively Priced.   Sensei has an average base product price around the $1500 mark, plus sensors to suit the implementation and the needs of the customer, usually a few hundred dollars or less. This is a very attractive price. There are installation costs to consider, as there are with any monitoring solution, and a monthly charge of around $30 per Sensei unit. There are no other Telco or third-party charges. Just one bill for everything.
Choose ArcoFlex Sensei for your next remote monitoring requirement.

Real Time Data with Arcoflex Sensei and Microsoft Azure



ArcoFlex Sensei – Real Time Data with Azure IoT
 
Remote monitoring seems to have two completely diametric applications: offline (and therefore latent) storage of data, and real time data. Latent data is fine. Collect, store and forget. No big deal: easy to obtain and cheap to collect. So let’s forget about this because everyone can do it. But what about real time? How ‘real time’ can we get, what does it cost and is it genuinely possible? Fortunately, Microsoft’s Azure IoT suite can provide exactly that.


Collecting data from remote sensors poses two problems: how fast can it traverse the various components of the cloud and what will it cost? Let us review the performance issue first and then analyse its impact on cost. If it is too expensive, we may not be able to accept the performance we desire. In order to get data from a device to a visualisation, these are the elements it must traverse:
·         Code at the sensor’s manager (Raspberry PI in our case)
·         IoT Hub where completely scalable input is available
·         Event hub where normalised JSON data needs to be available
·         SQL Azure DB where trend data is being persisted
·         Visualisation applications as a user end point
Arcoflex has established that traversing this entire gamut of processes and storage mechanisms can take as little as 600ms. This means that from the moment a sensor has recorded a value or assessed an alert condition, the resulting data is creating an SMS or email or updating a visualisation within one second. So we are able to boast sub-second responsiveness and can therefore claim real time data responses. This is a big call, but it is quite easy to prove and demonstrate. Our ArcoFlex Sensei Prototyper can do this out of the box.
Is this important? Yes, it is. Bio-security breach conditions are one example where it is required under law to prove that two doors are not open at the same time. You cannot do this if your data recording solution cannot resolve to less than a second. But there are other examples: motor burnout prevention, temperature control, fluid mixing processes, alarms… the list is actually quite endless.
The big question is of course: how? The answer comes in two parts: an innovative time and value compression algorithm from ArcoFlex, and Microsoft’s Azure platform. The two combine to yield a very cost-effective way to provide real time data from your remote monitoring solution. The first part is ArcoFlex’s TVDC compression algorithm. What it does is send only data that changes, according to configurable precision and frequency metrics. Because the 3G/4G solution environment sits directly on top of the sensors, this is what controls telecommunications provider costs. Keeping this low is important to the overall implementation cost.
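The TVDC algorithm itself is proprietary, but the idea described above can be sketched roughly as follows: quantise each reading to the configured precision and transmit only when the quantised value changes, or when a heartbeat interval expires. This is a minimal illustration, not the actual algorithm:

```csharp
using System;

// Rough sketch of the change-only idea: round each reading to the configured
// precision and transmit only when that rounded value has changed, or when a
// heartbeat interval has expired. Not the actual TVDC implementation.
public class ChangeOnlySender
{
    private readonly double _precision;
    private readonly TimeSpan _heartbeat;
    private double? _lastSent;
    private DateTime _lastSentAt = DateTime.MinValue;

    public ChangeOnlySender(double precision, TimeSpan heartbeat)
    {
        _precision = precision;
        _heartbeat = heartbeat;
    }

    public bool ShouldTransmit(double reading, DateTime utcNow)
    {
        double quantised = Math.Round(reading / _precision) * _precision;
        bool changed = _lastSent == null || Math.Abs(quantised - _lastSent.Value) >= _precision;
        bool heartbeatDue = utcNow - _lastSentAt >= _heartbeat;

        if (!changed && !heartbeatDue) return false;   // nothing worth paying bandwidth for

        _lastSent = quantised;
        _lastSentAt = utcNow;
        return true;
    }
}
```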

The difficulty with customised transmission protocols, however, is incompatibility: no-one else can read them. So what ArcoFlex does is read the compressed data into Microsoft IoT Hub, a mechanism that has enormous capacity and almost infinite scalability. But it has a cost and needs to be managed, so our compressed data packet management protocol ensures that IoT Hub expenses are minimised, as well as carrier costs for the 3G/4G connection. The targeted aim is less than $10/site per month for effectively unlimited amounts of trend data.
The problem, though, is that nothing else will read the compressed data, so the next step is to expand the data set into full JSON, something almost everyone can consume. In contrast to IoT Hub, Microsoft’s Event Hub is very cheap. So we take all the IoT Hub traffic, expand it into JSON and throw it at Event Hub. Now the data is in a format that can be consumed by regular BI and Stream Analytics tools or dumped into a SQL Azure table. Suddenly everyone has access, including our own visualisation tools.

We have net latency down to sub-second levels, and this is impressive. Bandwidth costs are contained without compromising overall system performance. If your IoT monitoring solution is not yielding genuine real time responsiveness, then it isn’t using ArcoFlex Sensei.

Geoff Schaller
@ArcoflexIOT

Tuesday 16 August 2016


IOT Data needs to be SMART, not Big
 

Two of the most over-hyped concepts out there right now are IOT and Big Data. Every time I hear people using the term Big Data I shudder. They have no idea what they mean and no idea what they are going to do with it. All they know is that Gartner and others tell them it will save the world and create universal peace and prosperity. Yeah, right… And fast food isn’t fattening?
Now mix IOT in with this. I have seen and read so much rubbish about how the veritable explosion of devices^ is going to create so much data that we will be swamped. I don’t know what this means: will Western civilisation collapse, or if we yawn long enough, will it just go away? Or will we need gum boots to push our way through?
But here is the problem and yes, IOT is the cause. A whole bunch of companies are racing off to build devices that are going to project data into something somewhere that someone somehow is going to visualise and goggle at endlessly. Apparently. So yes, I can set 1000 sensors on 50 farms to start sending data off to some cloud based storage somewhere. Repeated thousands of times over everywhere in the world, these little sensors are going to start pumping data at some massive rate (seconds? minutes? hours?) into cloud storage. How incredibly stupid and how incredibly wasteful.
We need SMART data, not big data and anybody who tells me they are going to create tonnes of data from their remote monitoring solutions is confirming their complete lack of credibility in this space. For data to be useful it needs to be relevant and accessible. Terabytes of unstructured drivel is still drivel. So in order to save the world from collapse we have worked on a SMART data solution:
Sparsely MAnaged Relevant Transactions
First, we need to be SPARSE.
Let me give you an example. You are monitoring an airlock and you need to know when a door is opened and how often. You might even want to know how long it is open for bio-security rating purposes. Your sample frequency is per second because the open door will trigger alerts. So if the door is shut, I don’t need to be told that over and over. As long as a heartbeat exists, I only want to know when the state changes – ie the door opens or closes. Another example: a cool room temperature which we monitor once a minute. Under normal conditions, the room will chill to between 1.5C and 4C. If the room temp doesn’t vary more than (say) 0.5C then I don’t want to know about it. Don’t send the data! It can easily be extrapolated at the storage end without fuss or complexity. Save the bandwidth and storage cost.
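A rough sketch of the door-switch case, assuming a simple poll loop and an illustrative 15-minute heartbeat; the names and message strings are invented for the example:

```csharp
using System;

// Sketch of the door-switch case above: report only when the state changes,
// otherwise just keep a periodic heartbeat alive so you know the sensor is
// still reporting. Names and intervals are illustrative.
public class DoorStateReporter
{
    private bool? _lastReported;
    private DateTime _lastSentUtc = DateTime.MinValue;
    private readonly TimeSpan _heartbeat = TimeSpan.FromMinutes(15);

    // Returns the message to send, or null when nothing needs to go over the wire.
    public string Poll(bool doorOpen, DateTime utcNow)
    {
        if (_lastReported == null || doorOpen != _lastReported.Value)
        {
            _lastReported = doorOpen;
            _lastSentUtc = utcNow;
            return doorOpen ? "DOOR_OPEN" : "DOOR_CLOSED";   // state change: always send
        }

        if (utcNow - _lastSentUtc >= _heartbeat)
        {
            _lastSentUtc = utcNow;
            return "HEARTBEAT";                              // no change, but prove we're alive
        }

        return null;                                         // sparse: send nothing
    }
}
```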
Secondly, the process needs to be MAnaged
Data has to be controlled at the source. It is too late once it gets into the hubs and storage pools. This is my problem with most LoRa solutions. They just pump data regardless because that is all they can do. There needs to be some intelligence at the device in order to restrict outgoing packets to sparse data and not big data. My contention is that the sending gateway must have application code and an operating system in order to manage what ends up in our IOT solution space. Managed devices are just as important to bandwidth and storage control as sparse data protocols are.
Thirdly, data needs to be Relevant

We must not send data just because we can. That is just vanity: the thought that somebody one day might want it. Phooey. We need a reason for the data we capture, if for no other reason than there is a cost to storage and a cost to processing. Let’s go back to our cool room example. We know that the thermostat should hold the room between 1.5C and 4C, but it is misbehaving. The repair man decides to change the precision to 0.1C and get readings every second instead of every minute. He wants to watch how fast the compressor pulls down the room and how fast it warms up. You see, we don’t really need this data for the last six months, just now, when a problem has been detected. Once the problem is fixed we revert to minute readings at a lower precision. We could always collect this data, but that just ignores the cost and performance hits we suffer as a consequence. We need to collect relevant data, not everything just because we can. That is the bowerbird syndrome on steroids. One more comment: this process of changing the capture parameters could be automated, if you think about it, assuming the device is managed. The repair man will take a profile of the data so that he can review comparative performance down the track. It is idle wastefulness to collect that data continually. It also means you don’t understand your system’s basic metrics.
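A sketch of what a switchable capture profile might look like; the numbers mirror the example above and the class is illustrative, not the actual configurator schema:

```csharp
using System;

// Sketch of the "relevant data" idea: the capture profile is just data, so the
// repair man (or an automated rule) can switch a cool room from its normal
// profile to a short-lived diagnostic profile and back again. Illustrative only.
public class CaptureProfile
{
    public double Precision { get; set; }        // smallest change worth recording, in C
    public TimeSpan SampleInterval { get; set; }
    public DateTime? RevertAtUtc { get; set; }   // when a temporary profile expires

    // Normal running: minute readings at 0.5C precision.
    public static readonly CaptureProfile Normal =
        new CaptureProfile { Precision = 0.5, SampleInterval = TimeSpan.FromMinutes(1) };

    // Diagnostic mode: per-second readings at 0.1C precision, for a limited time only.
    public static CaptureProfile Diagnostic(TimeSpan duration) =>
        new CaptureProfile
        {
            Precision = 0.1,
            SampleInterval = TimeSpan.FromSeconds(1),
            RevertAtUtc = DateTime.UtcNow + duration
        };
}
```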
Finally, we must think of our data as being Transaction based

So what do I mean by that? Not only should data be sparse, managed and relevant, it should be part of some logical business process. It should belong to a business transaction, not just be recorded ‘because’. For example, if I am monitoring the pressure differential across a filter system, the differential value will have levels that create alerts aimed at advising me when to replace the filters. Transactions accumulate into trend sets and usually have events that are generated from level settings. Yes, it might be interesting to watch the pressure differential, but it isn’t useful if it cannot result in a practical transaction of some kind. The important outcome here is how often I have to replace the filters, not the pressure differentials leading up to the event. OK, that data can be predictive in itself, but after a few months I have data for proactive replacement rather than waiting for failure. Note how the transaction actually evolves with experience. It is all about being smart and not just reactive. This is how we give our customers true value and it doesn’t derive from big data at all. Just smart data.

In summary, we need to collect sparsely managed data that is relevant and transaction oriented. Obviously we need to know why we want it and what we are going to do with it.
This is where most IOT solutions go wrong. They go out and collect data then look for a problem to solve with it. Data without relevance and without a transaction outcome. Ladies and gentlemen, turn this around. Go out there and find problems that you can build an IOT solution to solve.
Geoff Schaller
@ArcoflexIOT

Note: ^Gartner again, 50 billion by 2020, although the number changes more often than the font

ArcoFlex Sensei – Creating Virtual Sensors

 

 

ArcoFlex Sensei introduces a brand new concept to the IOT market: virtual sensors. Not possible on LoRa or single-device gateway solutions, Sensei can combine one or more real sensors into virtual sensors to represent real data or conditions that depend on multiple real world inputs.
 
What a virtual sensor solves is important. Take the example of a bio-security airlock: there are two doors with door switches, and the airlock is in breach if both doors are opened simultaneously. Without virtual sensors you are constrained to writing code in the visualisation layer or stream analytics layer which constantly monitors the outputs of two sensors, the two door switches. Not only do you have to customise the code each time you need this implementation, the processing workload is now on the server side and scaling is a problem. If 1000 nurseries need an airlock, the server has to continually monitor this process for 1000 sites. It is a massive performance hit and an unwanted complexity.
Enter ArcoFlex Sensei and a virtual sensor. The Raspberry Pi code creates a sensor from the combined input of both door switches and reports breach data like any other sensor. This is a simple logical AND, where both doors being open creates a breach, and even the length of time in breach can be measured in the visualisation. All processing is done on the remote Pi, so scaling is not a problem. The dashboard merely displays data as if it were another sensor.
The key to sensor and device performance is in making the thousands of Pis do the processing rather than the visualisation layer or the server. The resulting data stream becomes raw data just like any other data stream. This also allows for universal coding in the visualisation layer because the output is treated just like any other sensor.
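As an illustration, the airlock logic running on the Pi might be sketched like this; the class and member names are hypothetical, not the actual Sensei code:

```csharp
using System;

// Illustrative only: an airlock "virtual sensor" combining two real door
// switches on the Pi itself. It reports like any other sensor: breached or not,
// plus how long the breach has lasted.
public class AirlockVirtualSensor
{
    private DateTime? _breachStartUtc;

    public (bool Breached, TimeSpan Duration) Evaluate(bool doorAOpen, bool doorBOpen, DateTime utcNow)
    {
        bool breached = doorAOpen && doorBOpen;    // both doors open at once = breach

        if (breached && _breachStartUtc == null) _breachStartUtc = utcNow;
        if (!breached) _breachStartUtc = null;

        var duration = _breachStartUtc.HasValue ? utcNow - _breachStartUtc.Value : TimeSpan.Zero;
        return (breached, duration);
    }
}
```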
So far, we have come across several very useful implementations:

·       Hothouse Airlock – At the Toolangi Seed Potato farm, bio-isolation is required in order to obtain certification to sell into Queensland. Whilst the doors can be rigged with alarms locally, Sensei records breaches (or lack of) and provides documentation for compliance purposes.
·       Filter Differential – in our standard irrigation monitoring solution there is a pressure sensor either side of the micro-filter. A virtual sensor is created to monitor the difference in pressure across the filter. Once it gets too great, a new filter is ordered and a replacement scheduled. Again, proactive alerts are generated to warn of pressure deterioration.
·       Irrigation line Leaks – when the line is pumping, there is a certain back pressure that is normal. If the outward leg pressure drops too much when the pump is running, there is a leak that must be attended to.
·        Pump Run Confirm – the float switch for refill can be actuated but if the UV sterilizer breaks, the pump is disconnected. This means that the float switch alone isn’t enough of an indicator that the pump is actually running. This virtual sensor might check three inputs to determine if it is actually running.
·        Fungal Growth Profile – there is a relationship between humidity and temperature that creates conditions ideal for fungal growth in a hothouse. The formula for this relationship can be coded into the PI so that raw data can be fed to the customer as a percentage of ideal.
·       Heartbeat – based around a simple timer, a single byte can be sent through to the dashboard confirming that comms and data channels are all in order. This is critical where you have data which of its nature, changes only very slowly or not at all for long periods. The heartbeat allows you to very economically detect the overall health of your sensor array.
·       Trending Patterns – perhaps you are monitoring air pressure or temperature and you need a dashboard gauge which simply shows rising/steady/falling or drier/wetter and so on. Quite often, this fact is a metric in itself or can be combined into alerts and notifications. Defining these at the Pi level solves all the scaling issues mentioned above and ensures the output is recorded as data points in their own right.

With ArcoFlex Sensei, you can build virtual sensors to represent quite complex relationships between other sensors. These virtual sensors generate real time data in their own right but keep all the processing down at the sensor gateway. This makes for fully scalable business logic solutions to generate compound data without processing overheads.

But there is one other use for virtual sensors: output! Being able to control digital or analogue output is a vital part of remote monitoring and control. This isn’t just internal feedback, important as that can be; this is configurable control. Implemented with simple relays, a digital output could be instructed to turn on or off a motor, a fan, a compressor, a pump or in fact anything that can take a switched power supply. With a little bit of extra wiring we can also provide an analogue output (something like 0V to 5V) to control a potentiometer. An example of this might be a thermostat or a stepping motor.
The virtual sensor gives us a structured way to provide configurable metrics to a digital or analogue output. Expect this to be the growth zone for IoT going forward. Well, once they’ve solved all the other issues first.
Geoff Schaller
@ArcoflexIOT

ArcoFlex Sensei – Solving the IOT Sensor Problem
 
The IOT space – the bit that talks about remote monitoring and control (and most people ignore the control bit) - is full of companies scrambling to provide sensors and devices, vainly hoping someone will know what to do with them. The problem for the solution developer is that none of them are compatible and all of them are different. So how is the average business to navigate this sea of differences? This is where ArcoFlex Sensei comes to the rescue.


Add to that the fact that while everyone will tell you WHAT you should be doing, very few, if any, actually tell you how. Whilst it is easy enough to code visualisations and dashboards once you have data, how on earth do you get the data out of a sensor and into an event hub in order to extract it down to a visualisation? Let’s look at the problem:

·         You have sensors and now need to wire them to “something” to extract the data
·         Your sensor gateway needs to be able to address the sensor and extract the actual data
·         There are scaling, limits and ADC conversion to be concerned about
·         Digital inputs need some way to get their value into the gateway
·         Code is needed somewhere to find these values, connect with the Hub and send data
Each of these sensors needs to connect to the central gateway computer via an IO card. Unless you’re an electrical engineer or hobbyist with some experience, this is going to be a massive problem so let’s go over the sensor types you will encounter:

One-Wire.          One-Wire devices have an address that you need to determine (or read from the packet someone binned six weeks ago). Once you have the address, your gateway code will simply read the value in the form of a string. No conversion is necessary, but you will get 86 decimal places if you aren’t careful. One-Wire is good because the sensors are very cheap and quite accurate. Code is needed to loop around all the possible addresses each time you want a value (see the first sketch after the sensor types below).
Analogue.           To confuse matters, there are 4-channel and 8-channel analogue devices and they scale differently. The 4-channel sensors are vastly more accurate, but you can only have four per I/O card. They can be either current oriented (4mA) or voltage oriented (5V), and they may or may not have a zero setting as an offset, which will further reduce resolution. The 8-channel analogue sensors can only provide 0-255 as a maximum resolution and, if there is an offset, this is often 25% smaller. You need to know all of these characteristics so that you can correctly scale the resulting value (see the second sketch below).
Digital.                 To operate a digital sensor, you are required to supply a voltage (or 0V) to a set of pins that can be read as on or off. This means providing power to the circuit being switched or sensed. The same goes if you are supplying a signal back: you will then be operating a relay to switch something on or off that will carry current to the device being controlled.
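To make the one-wire and analogue cases concrete, here are two small sketches. The first shows the one-wire polling loop mentioned above, with the card-specific bus read abstracted behind a delegate; the second shows the scaling step for an analogue channel. Both are illustrative only and do not reflect the actual Sensei code or card specifications.

```csharp
using System;
using System.Collections.Generic;
using System.Globalization;

// Sketch of the one-wire polling loop. How the raw string is actually read
// from the bus depends on the I/O card, so it is abstracted behind a delegate;
// the addresses come from the sensor configuration.
public static class OneWirePoller
{
    public static Dictionary<string, double> ReadAll(
        IEnumerable<string> addresses,           // one-wire ROM addresses from the config
        Func<string, string> readRaw)            // card-specific "read value at address" call
    {
        var results = new Dictionary<string, double>();
        foreach (var address in addresses)
        {
            string raw = readRaw(address);       // e.g. "21.4375000000"
            if (double.TryParse(raw, NumberStyles.Float, CultureInfo.InvariantCulture, out double value))
            {
                // Round away the excess decimal places before they go anywhere near the wire.
                results[address] = Math.Round(value, 2);
            }
        }
        return results;
    }
}
```

The analogue case is just linear scaling once you know the raw range, the offset and the engineering range you configured:

```csharp
// Sketch of the scaling step: convert a raw ADC count into an engineering value
// using the channel's resolution, offset and configured range. Example numbers only.
public static class AnalogueScaler
{
    public static double ToEngineeringUnits(
        int rawCount,        // what the ADC reported
        int rawMin,          // raw count at the bottom of the signal (the "zero" offset)
        int rawMax,          // raw count at full scale
        double rangeMin,     // engineering value at rawMin, e.g. 0 kPa
        double rangeMax)     // engineering value at rawMax, e.g. 600 kPa
    {
        double fraction = (double)(rawCount - rawMin) / (rawMax - rawMin);
        return rangeMin + fraction * (rangeMax - rangeMin);
    }
}

// Example: an 8-bit channel (0-255) with an offset of 64 reading 200:
// double kPa = AnalogueScaler.ToEngineeringUnits(200, 64, 255, 0, 600);
```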
So the I/O cards used to connect sensors need to have compatible input channels and, if there are multiple cards, DIP switches set to address the cards. The code in your gateway device needs to accommodate all of this, and try as we might, the smaller Arduino Yun boards just couldn’t carry the programming. This was especially true once we came to the conclusion that we needed to manage and compress the collected data. Our conclusion was to move to a Raspberry Pi 3B processor board and Windows 10 IoT Core as an operating system. Linux would have worked too, but we’re a Microsoft shop and the appeal of using one language for the entire code base was just too attractive.
So ArcoFlex Sensei was built to solve all these sensor related problems. The basic model includes the following:
·         Raspberry PI3B with 1GB RAM and 8GB SSD card
·         W10 IOT Core and an application to manage the IO cards, all comms and management
·         One one-wire IO card (256+ sensors possible)
·         One 4 channel analogue IO card
·         One Digital IO card with up to 8 inputs
·         A power supply
·         A 4G gateway router and/or RJ45 LAN socket
·         Instructions (Yes!) on how to wire in your sensors
With just a few bits of wire, some pliers and a soldering iron, anyone can wire in any sensor they choose. $5 temperature sensors are a good start, and we have taken all the complexity out of knowing how to get data out of a sensor and onto the web. If you obtain one of our starter kits we can also provide the back-end code to visualise your sensor data through to a dashboard and even generate alerts.
With ArcoFlex Sensei, you can build a proof of concept device in just a few minutes.

Geoff Schaller
@ArcoflexIOT

Sunday 19 June 2016

Where IOT is Going Wrong


Where is IOT Going Wrong?
We’ve all seen the predictions. Apparently IOT will be bigger than sliced bread, Ben Hur and everything in between. My earlier blog referred you to the hype curve and explained some realities, but let me expand on why I think many companies will go broke or simply fizzle out. It boils down to this:

No-one is thinking clearly about the problems IOT should really be solving and how.
It comes back to the definition of IOT: what is it really? Once you strip out web sites, remote applications, phones, tablets, email and server to server functionality you are left with the “things” that IOT is really referring to - remote monitoring of small or single purpose devices. And this is where the concentration of IOT development is taking place. Let me explain where I believe it has all gone badly astray. There are four key problems I would like to explore with you:
·         An over emphasis on the value of LoRa
·         An inadequate understanding or visibility of true cost
·         Little genuine perception of IOT framework complexity
·         A complete lack of genuine applications making sense of it all
1.            The Inadequacy of LoRa
I almost wish no-one had thought of this: low-powered radio devices that send tiny bits of data on an infrequent basis. Almost useless. 12-byte messages and 144-per-day limits just won’t cut it for 90% of the remote monitoring requirements out there. There is no alert immediacy possible, no genuine data trending possible and no aggregation of multiple sensors possible. We can’t send commands back to the device or sensor and we can’t put intelligence at the point of collection. Even in agricultural scenarios, there are very few practical uses for this. You might think soil moisture or silo levels might be a viable sensor usage for LoRa, but this only makes sense if you have software to aggregate and manage the readings offline. It is useless for weather stations, water monitoring, hot house monitoring, stock monitoring and almost everything else. It is no good for building event monitoring, patient and medical alerts and just about any machine monitoring. LoRa really isn’t a useful data transmission methodology except in a very few specialist areas. Keep clear of it unless you really know how you are going to deal with the lack of data flow. Please make sure you know what LoRa solutions will NOT do for you.
2.            Nobody Told Me It Would Cost this Much!
The cost of solutions I see being constructed and peddled over the web is going to horrify people and scare off any genuine take-up. Part of this is because software vendors are wrapping up solutions independently of hardware vendors. This always means two parties need a cut and you will have two sides of the equation to deal with. But did they tell you what the monthly costs will be? There are going to be Telco bandwidth charges and then Azure (or Amazon or Google or…) framework charges. Just dealing with data is fraught. 3G/4G data costs are still expensive the world over, and if your solution doesn’t behave nicely over the link, you will be up for data costs you didn’t expect. If you start from the software side (someone selling you a data solution), make sure you understand the cost of sensors and probes. Most websites will not detail all this up front because they know you will panic when you see the numbers.
3.            The Complexity is not for the Faint-Hearted
If you are doing this yourself then be ready for the complexity and cost of collecting data for display. Making 24/7 auto scalable solutions is not easy on any of the Big Three platforms, despite the spin and hype they throw at you. For what it’s worth, only the Microsoft platform comes close to offering an API approach but this aspect of the industry is very immature. The platforms are not stable and their APIs have significant bugs. Your client is not going to listen to you telling him that Microsoft broke your connection last night and that’s why you didn’t get 12 hours of important data. The other complexity is having the software cooperate with the hardware in order to yield meaningful stats and alerts. Unfortunately, it requires a symbiotic relationship between the managing and reporting software and the hardware offering up the data. You simply will not be able to pick and choose from different vendors. This will dramatically complicate the design and installation of IOT systems.
4.            Where are the Apps!
This is my big question. Right now, they just aren’t out there. Basically we’ve put the cart before the horse. We need to know what we want and why before we go around spending millions on solutions to nothing. Perhaps if you want refrigeration alarm panels you can find something, but what about all those supposed agricultural solutions? Where are they? What about those alarm monitors? There’s basically nothing out there yet, so there is no point getting excited about a data collection capability until you have something that can deal with that data. Sounds simple, doesn’t it? Forget about the glitzy examples. Apparently there is an app out there that analyses driving behaviour and rates you against your friends. Duh! Well, that’s going to save industry millions… not! So the problem is that no-one is building practical apps that solve actual problems and so save people money or reduce risk. This is a bit of a problem.
In summary, the IOT industry is busy tearing itself apart to be first out there with really poor solutions. The customers are waiting for someone to build meaningful apps, and no-one wants to tell the truth about cost lest they scare them all away. We’ve got a long way to go…

Geoff Schaller
@IOTRemote