
Sasa.Operators<T> Overhaul - Now With More Generic Operator Goodness

Sasa.Operators<T> was covered in a previous post, and was useful in its own right, but was still somewhat limited in the operators it could expose. Since it abstracts over only a single type parameter T, it can expose only operators defined entirely in terms of T. For instance, addition has signature "T add(T, T)", negation is "T negate(T)", and so on.

But not all operators are defined on only a single type. For instance, System.Decimal provides addition operators that work on integers, both signed and unsigned. Sasa.Operators<T> couldn't handle that. It can now.

Sasa.Operators is now a fully generic operator framework. As before, the static generic class Operators<T> exposes operators over a single type parameter. Operators<T0, T1> exposes operators defined over two type parameters, like equality, whose signature is "bool equals(T0, T1)". Finally, Operators<T0, T1, T2> handles operators defined over three possible types, like addition, "T2 add(T0, T1)". Furthermore, all overloadable operators are now accessible, including True/False and explicit and implicit conversions.

The delegates accessible via the above Operators classes are also now more efficient, and no longer rely on LINQ expression compilation. This was all made possible by a new function I introduced under Sasa.Func called "Operator". It takes a delegate signature as a type parameter and a Sasa.Operator value designating the operator type, and it searches for the static operator method matching the needed delegate signature. It then creates a direct delegate to that method, so invocation via Operators is simply a delegate dispatch into a static method.
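The mechanics can be sketched in plain reflection. The following is a hypothetical analogue of the three-parameter case, not Sasa's actual implementation: a static generic class that looks up "T2 op_Addition(T0, T1)" on either operand type and binds a delegate directly to it, so each call is just a delegate invocation with no expression compilation.

```csharp
using System;
using System.Reflection;

// Hypothetical analogue of Operators<T0, T1, T2> (not Sasa's actual code):
// bind a Func<T0, T1, T2> directly to a static op_Addition method.
static class Ops<T0, T1, T2>
{
    public static readonly Func<T0, T1, T2> Add = Bind("op_Addition");

    static Func<T0, T1, T2> Bind(string name)
    {
        var args = new[] { typeof(T0), typeof(T1) };
        // Overloaded operators compile to public static methods; per the
        // C# rules, the method may live on either operand's type.
        var m = typeof(T0).GetMethod(name, BindingFlags.Public | BindingFlags.Static, null, args, null)
             ?? typeof(T1).GetMethod(name, BindingFlags.Public | BindingFlags.Static, null, args, null);
        return m == null ? null
             : (Func<T0, T1, T2>)Delegate.CreateDelegate(typeof(Func<T0, T1, T2>), m);
    }
}

class OpsDemo
{
    static void Main()
    {
        // System.Decimal defines op_Addition(decimal, decimal), so this binds.
        Console.WriteLine(Ops<decimal, decimal, decimal>.Add(1.5m, 2.5m)); // prints 4.0
    }
}
```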

The only exception is for primitive types, like Int32 and Int64, because they don't provide static operator methods. When the arguments are all primitives and no operator method is available, a small stub function is dynamically generated that implements the operation in efficient bytecode. Can't get much faster than this.
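Such a stub can be sketched with System.Reflection.Emit. This is only presumably similar in spirit to what Sasa generates, but it shows how small the generated code is: for integer addition, the method body is a single 'add' instruction.

```csharp
using System;
using System.Reflection.Emit;

class PrimitiveStub
{
    // Hypothetical sketch: primitives like Int32 have no op_Addition method,
    // so emit a tiny dynamic method whose body is just the 'add' instruction.
    public static Func<int, int, int> MakeIntAdd()
    {
        var dm = new DynamicMethod("add", typeof(int), new[] { typeof(int), typeof(int) });
        var il = dm.GetILGenerator();
        il.Emit(OpCodes.Ldarg_0); // push first argument
        il.Emit(OpCodes.Ldarg_1); // push second argument
        il.Emit(OpCodes.Add);     // add them
        il.Emit(OpCodes.Ret);     // return the result
        return (Func<int, int, int>)dm.CreateDelegate(typeof(Func<int, int, int>));
    }

    static void Main()
    {
        var add = MakeIntAdd();
        Console.WriteLine(add(3, 4)); // prints 7
    }
}
```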

All of this was precipitated by my implementation of a LINQ expression interpreter in the Sasa.Linq assembly. You can now easily evaluate almost any LINQ expression with a single call:

var z = CLR.Eval(() => 3 * 2.0 + 1); // z = 7.0
var x = CLR.Eval(() => new Func<int, int>(z => z + 1)(3)); // x = 4
...

This is obviously still in alpha, but it passes a few tests, and more will come. LINQ expressions already have compilation built in, but sometimes dynamic compilation either isn't available, or is too costly to perform. For instance, consider simplifying a LINQ to SQL query: you don't want to dynamically generate code every time you simplify an expression. An interpreter is the right choice here due to its much smaller overhead.
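To make the idea concrete, here is a toy interpreter over a few node types. This is only a sketch, not Sasa.Linq's implementation, and the real CLR.Eval handles far more of the expression grammar, but it shows the basic shape: walk the tree and evaluate each node directly, with no dynamic code generation.

```csharp
using System;
using System.Linq.Expressions;

class MiniEval
{
    // Toy sketch of a LINQ expression interpreter: recursively evaluate
    // nodes. Numeric operands are coerced to double for simplicity.
    public static object Eval(Expression e)
    {
        switch (e.NodeType)
        {
            case ExpressionType.Constant:
                return ((ConstantExpression)e).Value;
            case ExpressionType.Add:
                var a = (BinaryExpression)e;
                return Convert.ToDouble(Eval(a.Left)) + Convert.ToDouble(Eval(a.Right));
            case ExpressionType.Multiply:
                var m = (BinaryExpression)e;
                return Convert.ToDouble(Eval(m.Left)) * Convert.ToDouble(Eval(m.Right));
            default:
                throw new NotSupportedException(e.NodeType.ToString());
        }
    }

    static void Main()
    {
        // 3 * 2.0 + 1, built by hand so the compiler can't constant-fold it.
        var e = Expression.Add(
            Expression.Multiply(Expression.Constant(3.0), Expression.Constant(2.0)),
            Expression.Constant(1.0));
        Console.WriteLine(Eval(e)); // prints 7
    }
}
```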

Comments

David said…
Very cool! Was trying to build a version of this when I was in grad school, and ran into the expression problem, which you'd solved with some excellent visitor-fu.

Can you comment on which platforms and profiles are supported in Sasa? Your comments on dynamic compilation had me wondering if this would work with things like portable libraries, Silverlight, or the Xamarin iOS AOT compiler...or even MS's own pre-JITer (forget the name).
Sandro Magi said…
Dynamic compilation is only used for operators involving only CLR primitives, because primitives don't have operator methods. Big mistake on MS's part, IMO. Objects that have actual operators simply create delegates, so those should be safe to use in any context.

I haven't tested against other profiles though, other than client profiles. Dynamic compilation is certainly a concern in some of those other scenarios, but in principle all the necessary overloads could be pre-compiled. All I can say is, give it a try!
