I have written an application for an energy company that balances the national power generation schedule for a portfolio of power stations against the company's trading position. The client and server components were written in C#, but the calculation engine was written in F#.
The use of F# to address the complexity at the heart of this application clearly demonstrates a sweet spot for the language within enterprise software, namely algorithmically complex analysis of large data sets. My experience has been a very positive one. In particular:
Units of measure: The industry I work in is littered with units. The equations I implemented (often of a geometric nature) dealt with units of time, power and energy. Having the type system verify the correctness of the units of the inputs and outputs of functions is a huge time saver, both in terms of testing and of reading and understanding the code. It eradicates a whole class of errors that previous systems were prone to.
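A minimal sketch of how this looks; the measure names and the energy function below are illustrative rather than taken from the engine itself:

```fsharp
// Illustrative units of measure for power-scheduling quantities.
[<Measure>] type MW              // power: megawatts
[<Measure>] type h               // time: hours
[<Measure>] type MWh = MW * h    // energy: megawatt-hours

// Energy delivered at a constant power over a period.
let energy (power: float<MW>) (duration: float<h>) : float<MWh> =
    power * duration

// The compiler rejects dimensionally inconsistent calls, e.g.
// energy 50.0<MW> 0.5<MW> will not compile.
let halfHourEnergy = energy 50.0<MW> 0.5<h>   // 25.0<MWh>
```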
Exploratory programming: Working with script files and the REPL (F# Interactive) allowed me to explore the solution space more effectively than the traditional edit/compile/run/test loop before committing to an implementation. It is a very natural way for a programmer to build their understanding of the problem and the design tensions in play.
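For instance, a few lines from a hypothetical .fsx script can be sent straight to F# Interactive with no build step; the data and smoothing step here are invented purely to show the workflow:

```fsharp
// Half-hourly output readings (MW), invented for illustration.
let halfHourlyOutput = [ 40.0; 42.5; 48.0; 51.0; 47.5; 44.0 ]

// Try one candidate smoothing approach...
let smoothed =
    halfHourlyOutput
    |> Seq.windowed 3
    |> Seq.map Array.average
    |> List.ofSeq

// ...inspect the result in the REPL, tweak, and resubmit within seconds.
printfn "%A" smoothed
```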
Unit testing: Code written using side-effect-free functions and immutable data structures is a joy to test. There are no complex time-dependent interactions to screw things up and no large sets of dependencies to mock.
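A sketch of the sort of test this enables, using an invented imbalance function and NUnit-style assertions rather than the engine's actual tests:

```fsharp
open NUnit.Framework

// Pure function: imbalance is scheduled generation minus contracted position.
let imbalance (scheduled: float) (contracted: float) = scheduled - contracted

[<Test>]
let ``imbalance is generation minus position`` () =
    // No mocks, no setup, no time-dependent state: inputs in, result out.
    Assert.AreEqual(25.0, imbalance 125.0 100.0)
```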
Interoperation: I defined the interface to the calculation engine in C# and implemented the calculation in F#. The calculation engine could then be injected into any C# module that needed to use it without any concerns at all about interoperability. Seamless. The C# programmer need never know.
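The pattern looks roughly like this; the interface and record below are invented for the example, and in the real system the interface was declared in a C# assembly rather than in F#:

```fsharp
// Illustrative data type for a traded position in one settlement period.
type Position = { Period: int; Volume: float }

// In the application this interface was defined in C#; it is declared in F#
// here only to keep the example self-contained. The compiled type is the same.
type ICalculationEngine =
    abstract member Balance : positions: Position[] -> Position[]

// The F# implementation is an ordinary class that any C# module can
// construct or receive via dependency injection.
type CalculationEngine() =
    interface ICalculationEngine with
        member _.Balance(positions) =
            positions |> Array.map (fun p -> { p with Volume = -p.Volume })
```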
Code reduction: Much of the data fed into the calculation engine was in the form of vectors and matrices. Higher-order functions eat these for breakfast: minimal fuss, minimal code. Beautiful.
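For example, element-wise work over vectors collapses to a single call to a higher-order function; the figures below are made up:

```fsharp
// Scheduled generation and contracted position per half hour (MW).
let scheduled  = [| 120.0; 118.5; 121.0; 119.0 |]
let contracted = [| 115.0; 120.0; 119.5; 119.0 |]

// Element-wise difference of two vectors, no index bookkeeping.
let imbalances = Array.map2 (-) scheduled contracted

// Scale every element of a matrix (an array of row arrays) by a factor.
let scale (factor: float) matrix =
    matrix |> Array.map (Array.map (fun x -> factor * x))
```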
Lack of bugs: Functional programming can feel strange. I can be working on an algorithm, trying hard to get the code to pass the type checker, but once the type checker is satisfied, that's it: it works. It's almost binary: either it won't compile or it's correct. Weird edge-case errors are minimised, and recursion and higher-order functions remove a lot of the book-keeping code that introduces edge-case errors.
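As a small invented illustration, a running total over a vector needs no loop counter, no bounds check and no mutable accumulator to get subtly wrong:

```fsharp
// Trades over a day (MW), invented figures.
let trades = [ 10.0; -4.0; 7.5; -2.5 ]

// Cumulative position after each trade: [10.0; 6.0; 13.5; 11.0]
let runningPosition = trades |> List.scan (+) 0.0 |> List.tail
```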
Parallelism: The functional purity of the resulting implementation makes it ripe for exploiting the inherent parallelism in processing vectors of data. Maybe this is where I will go next, now that .NET 4 is out.
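A sketch of what that might look like; the per-period calculation here is just a placeholder for the real work:

```fsharp
// Half-hourly settlement periods in a day.
let periods = [| 1 .. 48 |]

// Stand-in for the real per-period calculation, which is pure and side-effect free.
let expensiveCalculation period =
    float period ** 2.0

// Because the function is pure, switching Array.map to Array.Parallel.map
// is safe and requires no other changes.
let results = periods |> Array.Parallel.map expensiveCalculation
```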