Saturday, January 21, 2012

Meta-Trader - Neural Nets - Part 2

Welcome back Meta-Traders.

Here's part 2 of my series on my efforts to implement an Expert Advisor in Meta-Trader that uses neural network technology.

The easiest way to build neural networks is with FANN, the Fast Artificial Neural Network Library - free, open-source software that you can download here. You use the library through an API (Application Programming Interface) of roughly a dozen function calls. There are routines to create, train, run and destroy networks, plus routines to save networks to disk and load them back.

My first thought was to build a sub-system that handled all the neural network work outside of MetaTrader. That way I could build and test networks outside of MetaTrader and poke and prod them to see how they work. This would give me some transparency into network operation and address one of my issues with Sunqu - that it creates, trains, executes and disposes of dozens of networks completely in memory, and never saves them to disk or leaves them around for outside examination.

Using Microsoft Visual Studio 2008, I created FannUtil.exe, a command-line program with the following interface:

Usage:
FannUtil.exe <command> <arguments>

Training:
FannUtil.exe -Train Fxnetdata.data fxnetdata.net

Execution:
FannUtil.exe -Execute Inputs.data fxnetdata.net OutputFile.data

Returns 0 on success, otherwise 1

This was promising since I could create, train and execute networks with only a few commands.
I'll leave the specifics of the data file formats for you to look up, but they are all text files and can be edited with notepad or whatever text editor you have available. Within a few hours, I had FannUtil.exe working like a champ on the XOR (Exclusive Or) example.
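As a taste of the format (do verify the details against the FANN documentation): training files are plain text, with a header line giving the number of pairs, inputs and outputs, followed by alternating input and output lines. A quick Python sketch - the helper name is my own - that writes the XOR training set this way:

```python
def write_fann_training_file(path, pairs, num_inputs, num_outputs):
    """Write training pairs in FANN's plain-text format: a header line
    'num_pairs num_inputs num_outputs', then alternating input and
    output lines."""
    with open(path, "w") as f:
        f.write(f"{len(pairs)} {num_inputs} {num_outputs}\n")
        for inputs, outputs in pairs:
            f.write(" ".join(str(x) for x in inputs) + "\n")
            f.write(" ".join(str(x) for x in outputs) + "\n")

# XOR: output is 1 when exactly one input is 1
xor_pairs = [
    ([0, 0], [0]),
    ([0, 1], [1]),
    ([1, 0], [1]),
    ([1, 1], [0]),
]
write_fann_training_file("xor.data", xor_pairs, 2, 1)
```

The same helper works for forex training data - only the pair list and the input/output counts change.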

Next step was to modify a MetaTrader expert to write its training data to a text file, so I could train networks outside of Meta-Trader and see how well they would converge on a solution using that data.

Neural networks may or may not find a solution for the data they are trained on. For example, the data may not contain a discernible pattern, or may contain contradictory information. The degree of success of the training is measured by the MSE, or Mean Squared Error, of the network being trained. The default "desired error" in the examples is 0.001, which is a very small error and indicative of a problem that can be solved with a high degree of certainty and a low degree of error - such as the XOR problem.
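For the record, the MSE is just the average of the squared differences between the desired outputs and what the network actually produced. A minimal Python illustration (names are mine):

```python
def mean_squared_error(targets, outputs):
    """MSE: average of the squared differences between desired
    and actual output values."""
    return sum((t - o) ** 2 for t, o in zip(targets, outputs)) / len(targets)

# A network that gets XOR mostly right scores close to zero:
print(mean_squared_error([0, 1, 1, 0], [0.1, 0.9, 0.8, 0.2]))
```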

To see if your problem has a solution, run your data through the network and see whether training converges. I coded FannUtil so that if the network went through 2 successive training iterations without reducing the error rate by at least 0.001, it stops training and reports the MSE. This way we can determine very quickly the lowest expected error rate for the problem being presented, without burning a lot of excess CPU cycles.
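The stopping rule above can be sketched in a few lines of Python. Here `train_epoch` is a stand-in for whatever runs one training pass and returns the network's current MSE; the names and the `max_epochs` safety cap are my own:

```python
def train_until_plateau(train_epoch, min_improvement=0.001, patience=2,
                        max_epochs=10000):
    """Keep training while the MSE still improves.  Stop after `patience`
    successive epochs that each fail to cut the error by at least
    `min_improvement`, and return the final MSE."""
    best_mse = float("inf")
    stalled = 0
    mse = best_mse
    for _ in range(max_epochs):
        mse = train_epoch()              # one pass over the training data
        if best_mse - mse >= min_improvement:
            best_mse = mse
            stalled = 0
        else:
            stalled += 1
            if stalled >= patience:      # two flat epochs in a row: give up
                break
    return mse

# Toy run: the error keeps halving, then flattens out below the threshold
errors = iter([0.5, 0.25, 0.125, 0.0625, 0.0624, 0.0623, 0.0622])
final_mse = train_until_plateau(lambda: next(errors))
print(final_mse)
```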

Using FannUtil, I ran some data files using the problem described in my previous post and I found the following error rates:
  • 0.045 - Error rate for 45 days training data
  • 0.069 - Error rate for 1 year training data
  • 0.077 - Error rate for 10 years training data
These error rates seem pretty high versus the target of 0.001. Clearly, the more data we present, the higher the resulting error rate. But is an error rate of 7% really so bad? After all, we are looking for only a very narrow edge in forex trading, and a 7% error might still result in a tradable edge.

Anyway, once I had Meta-Trader writing the training file, and FannUtil.exe training the networks and making predictions, I figured it would be pretty easy to modify the expert to launch FannUtil.exe and interface with the networks entirely via the EXE. That led to another series of challenges I did not expect, and we will cover those in my next post.
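For anyone prototyping the same pipeline outside of MetaTrader, the exit-code contract (0 for success, otherwise 1) makes the EXE easy to drive from a script. A hypothetical Python driver - the wrapper function is my own, and the file names mirror the usage shown earlier:

```python
import subprocess

def run_fannutil(args, exe="FannUtil.exe"):
    """Invoke FannUtil.exe and translate its exit code
    (0 = success, 1 = failure) into a boolean."""
    result = subprocess.run([exe] + args)
    return result.returncode == 0

# Example (assuming FannUtil.exe is on the PATH):
#   if run_fannutil(["-Train", "Fxnetdata.data", "fxnetdata.net"]):
#       run_fannutil(["-Execute", "Inputs.data", "fxnetdata.net",
#                     "OutputFile.data"])
```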

I uploaded FannUtil.exe and the training files from the cases above to my yahoo group at FX-Mon Yahoo Group in case anyone wants to play around with it.

Come back later for installment #3 and have a great week.

