Adding an LED Alert to WavesBlock Full Node Monitor

Whilst I have found the WavesBlock monitor useful for checking on my Full Node (see the original WavesBlock post here), and I often give it a quick glance to confirm everything is fine, when I'm busy and not paying much attention to it, it would be easy to miss when there's an issue.

I decided to add a small LED to the block that simply switches on when there's something that needs reviewing, for example high disk usage, the server not responding, or even positive alerts like mining a block.

The feature is divided into three tasks:

- Add an LED to the case and connect it to the NodeMCU
- Modify the server page to return an alert status
- Modify the NodeMCU code to switch on the LED when there's an alert, or a connectivity issue

Here's how it turned out, and below I'll describe how to implement it.


Step 1 - Add the LED

I drilled a small hole in the case and glued in an LED, connecting the + pin (the longer pin, the anode) to pin 7 on the NodeMCU, and the ground pin to GND.


Step 2 - Modify the server code

I modified the server response to contain a boolean flag named 'alert'. What you alert on is up to you; I opted for high disk, memory and CPU usage, and the server not mining. It's easy enough to add to this in future. I initially set the flag to true so I could test the LED.

$alert = false;
if($disk > 80 || $ram > 80 || $cpupc > 90 || $debugStatus['minerState'] != 'mining blocks') {
  $alert = true;
}

Step 3 - Modify the NodeMCU code

Here we need to check the alert flag and set the pin we attached the LED to (pin 7 in this case) to HIGH. Because this code runs every 60 seconds, the LED will stay lit for a full cycle should there be an issue, and if it's resolved by the next cycle, it will go out. I also light it if there's an HTTP connection error. Additionally, we set up the pin in init.lua.
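A minimal sketch of the Lua side, assuming the status page is fetched with NodeMCU's http module and the JSON body decoded with sjson; the variable names (alertPin, statusUrl) are illustrative, not the exact WavesBlock code:

```lua
-- init.lua: set up the LED pin once at boot
alertPin = 7                       -- NodeMCU pin index 7 (GPIO13 on the ESP8266)
gpio.mode(alertPin, gpio.OUTPUT)
gpio.write(alertPin, gpio.LOW)

-- In the polling callback that runs every 60 seconds:
http.get(statusUrl, nil, function(code, body)
  if code < 0 or body == nil then
    gpio.write(alertPin, gpio.HIGH)        -- connection error: light the LED
  else
    local data = sjson.decode(body)        -- cjson.decode on older firmware
    if data.alert then
      gpio.write(alertPin, gpio.HIGH)
    else
      gpio.write(alertPin, gpio.LOW)
    end
  end
end)
```

Because the pin is rewritten on every cycle, the LED goes out by itself on the first healthy response.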

If all is well, and you have set alert to true in the PHP script, your LED should light up on the next cycle. Now, it's time to try and max out your CPU and test it for real!

Hope this was useful, and if you have any questions, feel free to comment.



ODI SDK 12c Creating Master Repository – Solving Errors

Initially, I followed various guides about adding the jar files to the build path, and used the JavaDoc documentation to write code to create a repository. Right away I got an error:

Exception in thread “main” java.lang.IllegalArgumentException: Could not load JDBC driver class [weblogic.jdbc.sqlserver.SQLServerDriver]
	at oracle.odi.jdbc.datasource.DriverManagerDataSource.setDriverClassName(
	at oracle.odi.jdbc.datasource.DriverManagerDataSource.<init>(
	at <init>(

Fair enough, I guessed I had to add a jar for the connection type I'm using (in this case MS SQL Server). Google didn't throw up much though; it seems to be a fairly well protected driver. Anyway, I needed it locally, or at least on the ODI agent. In the end I used the JDBC driver exported from the WebLogic server (wlsqlserver.jar), which I found in "C:\Oracle\Middleware\Oracle_Home\oracle_common\modules\datadirect".

That seemed to do the trick, until the next error:

Error while updating Schema Version Registry Entry for ODI. Check if user has DBA permissions

This error was actually a bit misleading: whilst there obviously was an error updating the Schema Version Registry, it had nothing to do with DBA permissions, and the real problem was a complete failure to connect to the database rather than anything to do with updating the schema version. The message above was only the exception message; at this point I was not printing a stack trace, and only did so after exhausting other investigations. The stack trace revealed a much more helpful error message:

java.sql.SQLException: [FMWGEN][SQLServer JDBC Driver]This driver is locked for use with embedded applications.

Interesting, and again a little misleading. In the end I found that I also had to add two more jars to the build path: adding 'weblogic.jar' and 'wlclient.jar', from C:\Oracle\Middleware\Oracle_Home\wlserver\server\lib, resolved the error.

From this point I was able to successfully create a master repository:

Master repository creation started
Oct 03, 2017 11:48:57 AM oracle.odi.internal.util.OdiLogger info
INFO: New data source: [ODI_MASTER/*******@jdbc:weblogic:sqlserver://SERVER:1433;Databasename=ODI_MASTER]
Oct 03, 2017 11:49:03 AM oracle.ucp.common.UniversalConnectionPoolBase initInactiveConnectionTimeoutTimer
INFO: inactive connection timeout timer scheduled
Master repository creation successful
Elapsed Time in second : 1433

Unit Testing Stored Procedure ETL SQL Server


In many application development teams, unit testing is non-negotiable, but the SQL world still seems a long way behind when it comes to unit testing. Regardless of platform, when you have code, and that code is manipulating data, unit testing offers the same benefits as it does in the application world.

When you have lots of complex stored procedures, making one minor change to a table may affect other parts of the code in ways you were not expecting; unit testing gives you the confidence that all is well. Some of the benefits:

- After merging code from a development server into a test or live environment, running the unit test suite can give you confidence of a successful deployment.
- You can create static code analysis unit tests to monitor for occurrences of things that could cause problems. For example, if your system is based on UTC dates everywhere, you could check for the existence of GetDate() as opposed to GetUtcDate().
- Similarly, you can test code quality, for example that your index names follow a particular convention.
- New developers on a project can get started much quicker, due to the extra confidence the unit tests give them that they have not broken anything; the tests also assist in understanding the intent of the code they are testing.
- It makes it possible to develop in a TDD style, and to develop against a test data set created for the purpose, rather than relying on whatever data happens to be in your development environment, which may or may not exercise all the conditions it needs to.
- Developing against unit tests drives a good specification and requirements, and asks important questions earlier in the development process.

So, if all this makes sense, why are so many places not doing this?

I think it's down to a few reasons:

Developer Experience

Whereas the application world, whether it's Java, .NET, JavaScript or something else, has a variety of unit testing tools, methodologies and reporting tools, as well as lots of information online around best practices and tutorials, all of this is very limited in the SQL world. There are only a few testing frameworks, and most of these have limitations.

Functionality Isolation

Since most ETL database code manipulates data in tables, moving it from one place to another, it's common for development to be done in a shared development environment, with existing data from a test source system feeding the procs. To ensure a test is re-runnable and not fragile, the data needs isolating. This is difficult when the product, in this case SQL Server, does not have the concepts one would use in the application world to achieve this, for example dependency injection. It is, however, possible to do this using the tSQLt framework's FakeTable functionality, which essentially swaps your table for a new blank copy that you can insert test data into. The tests all run as part of a transaction, so any changes you make during the test execution are rolled back.
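A sketch of what that isolation looks like in tSQLt; the table and procedure names here (dbo.LoadCustomers and friends) are hypothetical, not from a real project:

```sql
EXEC tSQLt.NewTestClass 'CustomerETLTests';
GO
-- Hypothetical proc under test: dbo.LoadCustomers copies active rows
-- from dbo.CustomersStaging into dbo.Customers.
CREATE PROCEDURE CustomerETLTests.[test only active customers are loaded]
AS
BEGIN
    -- Swap both tables for empty fakes so existing data can't interfere
    EXEC tSQLt.FakeTable 'dbo.CustomersStaging';
    EXEC tSQLt.FakeTable 'dbo.Customers';

    INSERT INTO dbo.CustomersStaging (Id, Name, IsActive)
    VALUES (1, 'Alice', 1), (2, 'Bob', 0);

    EXEC dbo.LoadCustomers;

    -- Expect only the active row to arrive in the target table
    SELECT Id, Name INTO #Expected FROM (VALUES (1, 'Alice')) v (Id, Name);
    SELECT Id, Name INTO #Actual FROM dbo.Customers;
    EXEC tSQLt.AssertEqualsTable '#Expected', '#Actual';
END;
GO
EXEC tSQLt.Run 'CustomerETLTests';
```

Everything the test inserts is rolled back when the test transaction completes, so the run is repeatable.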

Developer Exposure

Many ETL and SQL developers don’t have a background in the application development world, and may have never been exposed to unit testing, and therefore may not understand the benefits it brings, as well as how to actually write a good unit test. Often this requires time and experience.

What to test?

So your proc takes some data from table A, and puts a selection of its columns into table B. What could possibly go wrong?

Whilst some procedures look simple at face value, there's often more than meets the eye, and ETL does fail. If we have some actual business logic (a transformation), it may be fairly obvious what to test, but sometimes we are simply moving data. Even then, it's still possible that we have made assumptions about the source data, or the joins, and it would be good to cover those with a unit test.

Even if the unit test acts simply as a smoke test, and just runs the proc, and it fails because a DDL change was made to a source or target table, this is still offering value, and allows you to easily verify whether a change you made had any knock-on effects elsewhere.
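In tSQLt, such a smoke test is only a few lines; the class and procedure names here are hypothetical:

```sql
EXEC tSQLt.NewTestClass 'SmokeTests';
GO
-- Passes as long as the proc runs; fails the suite as soon as a DDL
-- change to a source or target table breaks it.
CREATE PROCEDURE SmokeTests.[test LoadCustomers runs without error]
AS
BEGIN
    EXEC dbo.LoadCustomers;
END;
```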


Reporting

Whilst there are unit testing frameworks out there, they often run in the user's IDE and offer no shared view of results. It's possible to output tSQLt results as a JUnit XML file, which you can then export to a reporting system.
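For example, using tSQLt's built-in XML formatter:

```sql
-- Run every test and emit the results as JUnit-style XML,
-- which a CI server such as Jenkins can consume directly.
EXEC tSQLt.SetTestResultFormatter 'tSQLt.XmlResultFormatter';
EXEC tSQLt.RunAll;
```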

Where to run?

Often teams work on a shared server where changes are all made on a central database, and unit test runs may be affected by what another developer is currently working on. It would be good to be able to run tests in a completely isolated environment: that way Developer A can confirm his tests pass, and Developer B the same, although when both developers merge their code there could still be failures.


I like to run version control on database code (more on that in another article), and this gives us a point at which we can run unit tests: when a developer commits code.

Imagine we have a group of developers working on a project on a shared development server, and once their features are committed, the code from the source control develop branch is used to build a database, and the tests are run on that database, and a nice report is sent out with unit testing results.

This gives us a chance to see how that code affected the project as a whole, for example to identify any failures resulting from that code. It makes sense to run these tests outside of the shared development server, as there could be non-functional uncommitted code there.

I like the tSQLt framework: it's simple to use and easy for SQL developers to get into, since it uses stored procedures as tests and schemas as test classes, and it allows some nice isolation options.

One can use a CI tool such as Jenkins to execute the tests and produce a nice unit test report that is automatically emailed out to the team (as well as identify which developer broke the tests and therefore owes the team a beer 🙂).

I’ve written a simple utility that you can kick off from Jenkins, or any kind of task scheduler, that will run tSQLt tests and feed the output into a nice HTML report wherever you choose.
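As a rough idea of the kind of transformation involved, here's a hypothetical Python sketch (not the utility itself) that turns JUnit-style XML into a minimal HTML table; the element and attribute names follow the common JUnit XML shape, and the report layout is purely illustrative:

```python
import xml.etree.ElementTree as ET

def junit_to_html(xml_text: str) -> str:
    """Render JUnit-style XML test results as a simple HTML table."""
    root = ET.fromstring(xml_text)
    rows = []
    # JUnit XML may have a <testsuites> wrapper or a bare <testsuite>;
    # iter() finds the suites either way.
    for suite in root.iter("testsuite"):
        for case in suite.iter("testcase"):
            failed = case.find("failure") is not None
            status = "FAIL" if failed else "PASS"
            rows.append(
                f"<tr><td>{case.get('classname')}</td>"
                f"<td>{case.get('name')}</td><td>{status}</td></tr>"
            )
    return (
        "<html><body><table border='1'>"
        "<tr><th>Test class</th><th>Test</th><th>Result</th></tr>"
        + "".join(rows)
        + "</table></body></html>"
    )

sample = """<testsuites>
  <testsuite name="CustomerETLTests" tests="2" failures="1">
    <testcase classname="CustomerETLTests" name="test copies active customers"/>
    <testcase classname="CustomerETLTests" name="test rejects duplicates">
      <failure message="expected 1 row, got 2"/>
    </testcase>
  </testsuite>
</testsuites>"""

print(junit_to_html(sample))
```

From there it's a small step to write the HTML to a shared location, or attach it to the notification email.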

Freelancer Devops Process Tips

I come across the same basic issues time and time again; it's surprising how many setups don't have basic processes in place. I thought I'd start to share my observations, and the processes I have put in place personally to improve the development process, client relationships, professionalism and personal sanity.

This article is aimed at developers who find some of the below familiar. For example: you are progressing through a project, you have started to deliver testing builds, and client feedback starts to come in;

“Hello, I’ve had a look at the app, and here’s a few issues I have found, I’ll list them below.. ”

“Dear Mr Client, I have addressed your issues and redeployed the app, please let me know if you have any other issues”

“Thanks, I still seem to be getting the same issues. Also, I have added a few extra issues I spotted below”

“Apologies, I have redeployed the app. See my responses below in Red”

“Thanks, All good. I have added my responses in Blue”

You get the idea, and perhaps this seems familiar: weeks later there are many different email threads, perhaps a very colorful Excel document once the emails got out of hand, some lost changes, outstanding issues, 10 folders of code, and config files everywhere.

“This bug seems to be back again, it worked in the last version but its not working again”

I’ve worked at this end of the spectrum a lot, as well as on large enterprise waterfall PMO-controlled projects, and all the bits in between. But for when I’m calling the shots, here are some of the processes I put in place;

Simple Task/Feature/Bug tracking

I usually use Trello, with a few simple lanes (for example: To Do, In Progress, Ready to Test, Closed) and labels to separate system areas (for example: App, Backend). I find clients understand it pretty well, and within a few days they are happy to raise bugs this way. Discussion on each task or bug is kept within its ticket.

For a progress update, a client can simply log into Trello anytime and see what's being worked on, what's ready, and what's up next. Every time you modify a card they get an email, keeping them continually informed and involved.

It’s also useful to defer tasks for later without forgetting, for example, create a lane for pre-production tasks to remember.

Note: you can get a Chrome extension to number Trello cards; this makes it much easier when you need to reference issues over the phone with a client, for example.

Unit Testing / Regression

If you are not unit testing, you need to have at least a basic set of tests in place for regression testing. Deploying to clients for testing with recurring bugs is sloppy and unprofessional.

When you deploy a new version, you probably already have a click around to check that everything is working as expected. Why not automate at least this process, to save time, improve thoroughness, and ensure you don't forget to test some areas?

Why not send the client a test report along with the deployment, to show that you have some basic checks in place?


Why not send release notes too?

If you are not using source control, you can sign up at BitBucket (unlimited private personal repositories).

As well as the obvious benefits, you can tag each release and generate a list of the changes included in the build. If you commit with the Trello card number, you can reference each change back to its requirement. There are lots of plugins and integrations between Git and Trello already.
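As a sketch, with placeholder tag names and a made-up Trello card prefix in the commit messages:

```shell
# Tag the release you have just deployed (v1.1 is a placeholder)
git tag -a v1.1 -m "Release 1.1"

# List every commit between the previous release and this one, one per
# line; committing with the Trello card number lets you map each change
# back to its requirement.
git log v1.0..v1.1 --oneline
```

The output of `git log` makes a serviceable set of release notes with no extra effort.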

BitBucket allows you to send an email out on commits; you could add your client's email to this list so they see regular updates. They don't have to understand the code, but it adds to the feeling of involvement and being updated, as well as to your professionalism.

By having BitBucket set up, the client already has all the code. This may be useful for handover purposes, but worth bearing in mind should your payment structure not allow handing over any code until later.


Hopefully some of this is useful, please let me know. In the next article, I'll cover some thoughts around automation.

Agree? Disagree? I'm not a writer, nor a world-class expert on what I write about. I write this blog for many reasons and welcome anyone's opinions or advice. Please do leave a comment or drop me a mail with any thoughts!




