Code Pyre

All Code Dies and Burns in Time

Chewie 2.0 Beta

NuGet isn’t going away, so the best we can do is try to reduce the friction it imposes on us. Chewie is a project created by Eric Ridgeway in June 2011 with the goal of bringing Bundler-like functionality to NuGet. While this solution has a number of shortcomings, listed in my last NuGet post, I believe it is a much better solution than the vanilla NuGet experience. I rewrote Chewie based on many of Bundler’s features, and it has reached a point where I believe it is ready for a public preview.

Sample .NugetFile
install_to 'lib'
chew Ninject [3.0.1.10]
chew Giles [0.1.4) dev -source "http://somethingrandom.feed.org"
chew NUnit

New-Group "Dev" {
  chew "Glimpse" "0.87"
}
Sample Commands
chewie init -path packages
chewie install Ninject
chewie install # installs all dependencies in the .NugetFile
chewie outdated # determines if any packages are outdated based on their version requirements
chewie update
chewie uninstall
chewie convert

Modules and Mixins for .NET: Taking Delegation to a New Level

In the last article on prototypal inheritance using Archetype, I showed how to construct a delegation chain and simulate JavaScript’s prototypal inheritance for .NET languages. If you haven’t looked at that post, I highly recommend reading it through, as I am going to skip over a lot of the theory I covered previously. Today I am taking delegation in .NET a step further. If you aren’t familiar with mixins and Ruby modules, I would also recommend looking into them, as what I am showing here is directly inspired by them.

When looking at prototypal inheritance for .NET, we had a very simple delegation chain. With module support, we need to amend how we evaluate and interpret expressions being called on our objects. Does the current object support the operation? Can it respond? If not, we need to loop through the prototypes (modules/mixins) that have been attached to this instance; however, as with most things in life, there is a catch. We need to process them last to first, and at each level we need to start the evaluation all over again. Why last to first?

When we declare a module/mixin for our class, it takes precedence over the modules that have already been imported. This mimics the way Ruby works: defining a property or method that already exists redefines that member.
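
As a sketch of that resolution order (assuming the DelegatingObject class shown below; the greeter classes are illustrative names, not from the real code), the module declared last shadows the earlier one:

```csharp
// Two hypothetical modules that define the same member.
public class EnglishGreeter { public string Greet() { return "Hello"; } }
public class SpanishGreeter { public string Greet() { return "Hola"; } }

// SpanishGreeter is added last, so it is checked first during
// resolution and shadows EnglishGreeter.Greet, just as a later
// `include` wins over an earlier one in Ruby.
dynamic greeter = new DelegatingObject( new EnglishGreeter(),
                                        new SpanishGreeter() );
Console.WriteLine( greeter.Greet() ); // "Hola"
```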

Revisiting the DelegatingObject (formerly PrototypalObject), I have added a new backing list to replace the single prototype from the earlier version. All existing tests and code continue to work with these minimal changes. The big change is that the DynamicMetaObject is now created through the ModuleMetaObject constructor. We pass a collection of prototypes/modules to be evaluated by the ModuleMetaObject, each of which may itself contain many more sets of modules and delegation chains.

public class DelegatingObject : DynamicObject {
    public DelegatingObject( params object[] modules ) {
        Modules = new List<object>( modules ?? new object[]{} );
    }

    public IList<object> Modules { get; protected set; }

    public override DynamicMetaObject GetMetaObject( Expression parameter ) {
        DynamicMetaObject baseMetaObject = base.GetMetaObject( parameter );

        if ( Modules == null || Modules.Count == 0 ) {
            return baseMetaObject;
        }

        return new ModuleMetaObject( parameter, this, baseMetaObject, Modules );
    }
}

Instead of executing a preorder traversal of the module chains, we route all binding calls to a new method, ApplyBinding, which isolates all of the resolution logic in a single place. We have an additional constraint this time: we are binding the expression to delegation chains that may fail (whereas before we had only a single chain that could fail). When such a binding failure occurs, an expression with a NodeType of ExpressionType.Throw is returned, and we need to ignore these failed chains. I am also able to leverage Func&lt;,&gt;, Func&lt;,,&gt;, and closures to capture the binding calls and context arguments, which makes every override an easy call to ApplyBinding.

There is one operation that has a custom implementation. The BindConvert call passes a lambda expression which calls the Convert utility method. Casting in .NET has some interesting nuances that have to be taken into account; module resolution would break if conversion were handled just like everything else.

public class ModuleMetaObject : DynamicMetaObject
{
    private readonly DynamicMetaObject _BaseMetaObject;
    private readonly IList<object> _Modules;

    public ModuleMetaObject( Expression expression,
                             object value,
                             DynamicMetaObject baseMetaObject,
                             IList<object> modules )
            : base( expression, BindingRestrictions.Empty, value ) {
        _Modules = modules;
        _BaseMetaObject = baseMetaObject;
    }

    protected DynamicMetaObject BaseMetaObject { get { return _BaseMetaObject; } }

    protected IList<object> Modules { get { return _Modules; } }

    protected virtual DynamicMetaObject AddTypeRestrictions( DynamicMetaObject result, object value ) {
        BindingRestrictions typeRestrictions =
                GetTypeRestriction().Merge( result.Restrictions );
        var metaObject = new DynamicMetaObject( result.Expression, typeRestrictions, value );
        return metaObject;
    }

    protected virtual DynamicMetaObject CreateModuleMetaObject( object module ) {
        DynamicMetaObject moduleMetaObject = Create( module, Expression.Constant( module ) );
        return moduleMetaObject;
    }

    protected virtual BindingRestrictions GetTypeRestriction() {
        if ( Value == null && HasValue ) {
            return BindingRestrictions.GetInstanceRestriction( Expression, null );
        }
        return BindingRestrictions.GetTypeRestriction( Expression, LimitType );
    }

    protected Expression GetLimitedSelf() {
        return AreEquivalent( Expression.Type, LimitType )
                       ? Expression
                       : Expression.Convert( Expression, LimitType );
    }

    protected bool AreEquivalent( Type lhs, Type rhs ) {
        return lhs == rhs || lhs.IsEquivalentTo( rhs );
    }

    public override DynamicMetaObject BindBinaryOperation( BinaryOperationBinder binder, DynamicMetaObject arg ) {
        return ApplyBinding( meta => meta.BindBinaryOperation( binder, arg ),
                             ( target, errorSuggestion ) =>
                             binder.FallbackBinaryOperation( target, arg, errorSuggestion ) );
    }

    public override DynamicMetaObject BindConvert( ConvertBinder binder ) {
        return ApplyBinding( meta => Convert( binder, meta ), binder.FallbackConvert );
    }

    public override DynamicMetaObject BindCreateInstance( CreateInstanceBinder binder, DynamicMetaObject[] args ) {
        return ApplyBinding( meta => meta.BindCreateInstance( binder, args ),
                             ( target, errorSuggestion ) =>
                             binder.FallbackCreateInstance( target, args, errorSuggestion ) );
    }

    public override DynamicMetaObject BindDeleteIndex( DeleteIndexBinder binder, DynamicMetaObject[] indexes ) {
        return ApplyBinding( meta => meta.BindDeleteIndex( binder, indexes ),
                             ( target, errorSuggestion ) =>
                             binder.FallbackDeleteIndex( target, indexes, errorSuggestion ) );
    }

    public override DynamicMetaObject BindDeleteMember( DeleteMemberBinder binder ) {
        return ApplyBinding( meta => meta.BindDeleteMember( binder ), binder.FallbackDeleteMember );
    }

    public override DynamicMetaObject BindGetMember( GetMemberBinder binder ) {
        return ApplyBinding( meta => meta.BindGetMember( binder ), binder.FallbackGetMember );
    }

    public override DynamicMetaObject BindGetIndex( GetIndexBinder binder, DynamicMetaObject[] indexes ) {
        return ApplyBinding( meta => meta.BindGetIndex( binder, indexes ),
                             ( target, errorSuggestion ) =>
                             binder.FallbackGetIndex( target, indexes, errorSuggestion ) );
    }

    public override DynamicMetaObject BindInvokeMember( InvokeMemberBinder binder, DynamicMetaObject[] args ) {
        return ApplyBinding( meta => meta.BindInvokeMember( binder, args ),
                             ( target, errorSuggestion ) =>
                             binder.FallbackInvokeMember( target, args, errorSuggestion ) );
    }

    public override DynamicMetaObject BindInvoke( InvokeBinder binder, DynamicMetaObject[] args ) {
        return ApplyBinding( meta => meta.BindInvoke( binder, args ),
                             ( target, errorSuggestion ) => binder.FallbackInvoke( target, args, errorSuggestion ) );
    }

    public override DynamicMetaObject BindSetIndex( SetIndexBinder binder,
                                                    DynamicMetaObject[] indexes,
                                                    DynamicMetaObject value ) {
        return ApplyBinding( meta => meta.BindSetIndex( binder, indexes, value ),
                             ( target, errorSuggestion ) =>
                             binder.FallbackSetIndex( target, indexes, value, errorSuggestion ) );
    }

    public override DynamicMetaObject BindSetMember( SetMemberBinder binder, DynamicMetaObject value ) {
        return ApplyBinding( meta => meta.BindSetMember( binder, value ),
                             ( target, errorSuggestion ) =>
                             binder.FallbackSetMember( target, value, errorSuggestion ) );
    }

    public override DynamicMetaObject BindUnaryOperation( UnaryOperationBinder binder ) {
        return ApplyBinding( meta => meta.BindUnaryOperation( binder ), binder.FallbackUnaryOperation );
    }

    protected virtual DynamicMetaObject ApplyBinding( Func<DynamicMetaObject, DynamicMetaObject> bindTarget,
                                                      Func<DynamicMetaObject, DynamicMetaObject, DynamicMetaObject>
                                                              bindFallback ) {
        DynamicMetaObject errorSuggestion = ResolveModuleChain( bindTarget );
        return ( errorSuggestion == null )
            ? bindTarget( BaseMetaObject )
            : bindFallback( BaseMetaObject, errorSuggestion );
    }

    private DynamicMetaObject ResolveModuleChain( Func<DynamicMetaObject, DynamicMetaObject> bindTarget ) {
        for ( int index = Modules.Count - 1; index >= 0; index-- ) {
            object module = Modules[index];
            DynamicMetaObject metaObject = GetDynamicMetaObjectFromModule( bindTarget, module );

            if ( metaObject == null ||
                 metaObject.Expression.NodeType == ExpressionType.Throw ) {
                continue;
            }

            return metaObject;
        }
        return null;
    }

    private DynamicMetaObject GetDynamicMetaObjectFromModule( Func<DynamicMetaObject, DynamicMetaObject> bindTarget,
                                                              object module ) {
        DynamicMetaObject moduleMetaObject = CreateModuleMetaObject( module );
        DynamicMetaObject boundMetaObject = bindTarget( moduleMetaObject );
        DynamicMetaObject result = AddTypeRestrictions( boundMetaObject, boundMetaObject.Value );
        return result;
    }

    private static bool TryConvert( ConvertBinder binder, DynamicMetaObject instance, out DynamicMetaObject result ) {
        if ( instance.HasValue && instance.RuntimeType.IsValueType ) {
            result = instance.BindConvert( binder );
            return true;
        }

        if ( binder.Type.IsInterface ) {
            result = new DynamicMetaObject( Convert( instance.Expression, binder.Type ),
                                            BindingRestrictions.Empty,
                                            instance.Value );
            result = result.BindConvert( binder );
            return true;
        }

        if ( typeof (IDynamicMetaObjectProvider).IsAssignableFrom( instance.RuntimeType ) ) {
            result = instance.BindConvert( binder );
            return true;
        }

        result = null;
        return false;
    }

    private static DynamicMetaObject Convert( ConvertBinder binder, DynamicMetaObject instance ) {
        DynamicMetaObject result;
        return TryConvert( binder, instance, out result ) ? result : instance;
    }

    private static Expression Convert( Expression expression, Type type ) {
        return expression.Type == type ? expression : Expression.Convert( expression, type );
    }
}

One great thing about this implementation is that we can replicate prototypal inheritance completely. We just need to make sure that each module we create only has a single base module all the way through the chain.
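
For example, a pure prototype chain is just nested DelegatingObjects, each holding a single module (a sketch assuming the classes above; Animal is an illustrative name, not from the original code):

```csharp
// Each level has exactly one "prototype": rex -> dog -> animal,
// mirroring a JavaScript prototype chain.
public class Animal { public string Speak() { return "..."; } }

var animal = new DelegatingObject( new Animal() );
var dog = new DelegatingObject( animal );  // dog's single base module
dynamic rex = new DelegatingObject( dog ); // rex -> dog -> animal

// Speak is not defined on rex or dog, so resolution walks the
// chain until it reaches the Animal instance.
Console.WriteLine( rex.Speak() );
```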

With these two classes now in place, it is time to start having some fun. While there are many fun ways we can leverage the DelegatingObject to mix in behavior, today I am going to focus on one very specific usage.

Invariably, when talking about module support in .NET, one of the most common requests you will hear is for an INotifyPropertyChanged module. While this can now be done easily, we get an added benefit from the way that ModuleMetaObject handles casting. We can pass around our model object and cast it to INotifyPropertyChanged without the need to generate a runtime proxy. The cast will actually dig through the module hierarchy, find that we have a module which provides the interface, and return that object.

First, let’s create a utility interface and module that will help us out and abstract some boilerplate code.

public interface INotifyPropertyChanges : INotifyPropertyChanged, INotifyPropertyChanging {
    void OnPropertyChanged( string propertyName = "" );
    void OnPropertyChanging( string propertyName = "" );
}

public class NotifyPropertyChangesModule : INotifyPropertyChanges {
    public event PropertyChangedEventHandler PropertyChanged;
    public event PropertyChangingEventHandler PropertyChanging;

    public virtual void OnPropertyChanged( string propertyName = "" ) {
        PropertyChangedEventHandler handler = PropertyChanged;
        if ( handler != null ) {
            handler( this, new PropertyChangedEventArgs( propertyName ) );
        }
    }

    public virtual void OnPropertyChanging( string propertyName = "" ) {
        PropertyChangingEventHandler handler = PropertyChanging;
        if ( handler != null ) {
            handler( this, new PropertyChangingEventArgs( propertyName ) );
        }
    }
}

One thing you may notice is that I did not use the CallerMemberName attribute available in .NET 4.5. When using modules, the caller name will not be the originating caller you expect. Since we are building expression trees to call the members at runtime, you cannot rely on this new feature.

With a module in place, we can now create a model which can have the functionality weaved in:

public class Person : DelegatingObject {
    private string _Name;

    public Person()
            : this( new NotifyPropertyChangesModule() ) { }

    public Person( params object[] modules )
            : base( modules ) { }

    public string Name {
        get { return _Name; }
        set {
            This.OnPropertyChanging( "Name" );
            if ( _Name != value ) {
                _Name = value;
                This.OnPropertyChanged( "Name" );
            }
        }
    }

    private dynamic This { get { return this; } }
}

The implementation is pretty simple. The thing to remember is that in order for the module chain to trigger, whether you are inside the object or acting on the model object, the invocation must take place on a dynamic object. I have provided a simple property, This, which casts the model object to dynamic, allowing us to access the late-bound method calls.
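
To see why the dynamic invocation matters, compare a statically typed call with a late-bound one (a sketch using the Person class above):

```csharp
var person = new Person();

// This line would not compile: Person itself has no
// OnPropertyChanged member, only its module does.
// person.OnPropertyChanged( "Name" );   // error CS1061

// Routed through the DelegatingObject meta-object, the same call
// is resolved against the NotifyPropertyChangesModule at runtime.
dynamic d = person;
d.OnPropertyChanged( "Name" );
```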

There are a few different ways we can work with this object now:

public void UsingModelObjectAsDynamic() {
    dynamic person = new Person();
    // The cast to the interface will work returning the inner module
    INotifyPropertyChanges inpc = person;
    inpc.PropertyChanged +=
            ( sender, args ) => Console.WriteLine( "The field {0} has changed.", args.PropertyName );
    inpc.PropertyChanging +=
            ( sender, args ) => Console.WriteLine( "The field {0} is changing.", args.PropertyName );
    // We have full IntelliSense when working with inpc,
    // but now accessing person.Name loses IntelliSense
    person.Name = "Inigo Montoya"; // trigger the events
}

public void UsingModelObjectAsStronglyTyped() {
    Person person = new Person();
    // Casting first to dynamic triggers the DelegatingObject's casting system
    INotifyPropertyChanges inpc = (dynamic) person;
    inpc.PropertyChanged +=
            ( sender, args ) => Console.WriteLine( "The field {0} has changed.", args.PropertyName );
    inpc.PropertyChanging +=
            ( sender, args ) => Console.WriteLine( "The field {0} is changing.", args.PropertyName );
    // We have full IntelliSense when working with person.
    person.Name = "Inigo Montoya"; // trigger the events
}

Sometimes, we may want to handle accessing the interface provided by a module. Again leveraging the dynamic casting, we can create a new property which will provide us access to the interface. I have also created a new Age property which takes advantage of this feature and now gives us IntelliSense when acting on the INotifyPropertyChanges feature.

private int _Age;

public int Age {
    get { return _Age; }
    set {
        Inpc.OnPropertyChanging( "Age" );
        if ( _Age != value ) {
            _Age = value;
            Inpc.OnPropertyChanged( "Age" );
        }
    }
}

internal INotifyPropertyChanges Inpc { get { return This; } }

We can also leverage this property to provide external access to our mixed in behavior:

public void UsingModelWithProxyCastingProperty() {
    Person person = new Person();
    // The cast property to give us IntelliSense
    person.Inpc.PropertyChanged +=
            ( sender, args ) => Console.WriteLine( "The field {0} has changed.", args.PropertyName );
    person.Inpc.PropertyChanging +=
            ( sender, args ) => Console.WriteLine( "The field {0} is changing.", args.PropertyName );
    // We also still have IntelliSense on Name
    person.Name = "Inigo Montoya"; // trigger the events
}

So far, each model has received its own instance of a module, but that is by no means the only usage pattern. Objects can share modules and behavior:

public void ShareModulesToShareBehavior() {
    var module = new NotifyPropertyChangesModule();
    module.PropertyChanged +=
            ( sender, args ) => Console.WriteLine( "The field {0} has changed.", args.PropertyName );
    module.PropertyChanging +=
            ( sender, args ) => Console.WriteLine( "The field {0} is changing.", args.PropertyName );
    Person inigo = new Person( module ) { Name = "Inigo" };
    Person ian = new Person( module ) { Name = "Ian" };
    // Four events triggered by setting the names, 2x Changed, 2x Changing
    inigo.Age = 35;
    ian.Age = 30;
    // The module is now acting somewhat like a message channel in a message broker
}

This barely scratches the surface of what is possible, but hopefully it gives you a taste.

Another simple pattern you can apply is adding an ExpandoObject to the beginning of your module chain (thus it is the last resolved). This gives you the Expando’s behavior as a fallback should all else fail. You can also create a system to handle an analog to method_missing by overriding the TryXyz members you get when deriving from DelegatingObject (via DynamicObject). I am playing around with a sample implementation right now which will also delegate to static members.
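
A minimal sketch of the Expando fallback (assuming the Person class from above; Nickname is an illustrative member, not part of the original code):

```csharp
// Placed first in the module list, the ExpandoObject is resolved
// last, so it only catches members nothing else could handle.
dynamic fallback = new ExpandoObject();
fallback.Nickname = "Dread Pirate Roberts";

dynamic person = new Person( fallback, new NotifyPropertyChangesModule() );
person.Name = "Westley";                // handled by Person itself
Console.WriteLine( person.Nickname );   // falls through to the Expando
```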

If you find this interesting or have ideas for other modules, I’d love to hear about them.

NuGet: You’re Doing It Wrong

NuGet is the new (well, relatively new) hotness. Everyone loves NuGet. I am even guilty of extolling some of its virtues. I am not going to talk about the things NuGet does well; they have been documented over and over again, and I don’t really have anything to add to those lists of awesome features. I cannot, however, endorse the direction the owners of NuGet have taken the product.

Microsoft, along with most of the industry, went from using build domain-specific languages (DSLs) to drowning in XML. MSBuild, Ant, NAnt, XBuild, etc. are all anathema to our profession. Don’t get me wrong, they served their purpose, but we have to learn from those mistakes and move toward build systems based on coded build scripts using tools like Rake and Albacore, jake, and psake.

Leveraging existing programming languages, we can utilize their capabilities in creating DSLs to run our builds. Funny as this is, we have now come full circle, but with a better understanding of what a build should be.

Why bring this up? It applies directly to NuGet’s exhaustive use of XML. On top of that, we have NuGet’s incredibly limited command-line experience outside of Visual Studio, and the choice to disempower users with unneeded hand holding. There is an incredible focus on forcing users to manage their dependencies inside Visual Studio, with a number of features, such as the PowerShell tool scripts, only working when you have the solution loaded.

Ruby

Ruby has had a package management system, gems, and a dependency bundling tool, Bundler, for years. Gem specs are Ruby code veiled in a thin DSL. You can see examples in the rspec gemspec and the Albacore gemspec. They were made simple by leveraging the language, and they are incredibly powerful.

Bundler scripts are Ruby code veiled in a thin DSL which defines the external dependencies in one place (in addition, there is a Gemfile.lock which ensures other developers will use the same packages that you do, based on all of your packages’ dependency versioning requirements). Again, we can look to the elegant simplicity of the Albacore Gemfile for an example.

The Ruby community has fully embraced their language and package management system, and made them first-class citizens. When you clone a repo, you execute ‘bundle install’ and all of your dependencies are pulled down. You do this once. From then on you can check for updates, verify that packages are up-to-date, and perform all other dependency lifecycle tasks. You can also manage your gem projects with the Jeweler gem; it has some very enticing features (if you looked at the Albacore gemspec, you likely saw that it is generated by Jeweler).

What about NuGet?

NuGet defines its specs with XML. And to be all great and powerful XML, it is schema validated. Ninety-three percent of the first three lines of a package are copy & paste noise. This is a waste of time and hides the intent of the spec.

<?xml version="1.0"?>
<package xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <metadata xmlns="http://schemas.microsoft.com/packaging/2010/07/nuspec.xsd">

NuGet defines its dependencies with XML and manages its local package configurations with a repositories.config file and many, many packages.config files. Each of these packages.config files is stored in a project folder. If you use NuGet the way it is designed to work, you have no way of seeing what dependencies your application actually has without loading your solution into VS and making sure that all of these config files are checked in across your codebase.

Examples

Let’s start by taking a look at a simple spec definition.

<?xml version="1.0"?>
<package xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <metadata xmlns="http://schemas.microsoft.com/packaging/2010/07/nuspec.xsd">
    <id>Ninject</id>
    <version></version>
    <authors>Ninject Project Contributors</authors>
    <requireLicenseAcceptance>true</requireLicenseAcceptance>
    <licenseUrl>https://github.com/ninject/ninject/raw/master/LICENSE.txt</licenseUrl>
    <summary>IoC container for .NET</summary>
    <language>en-US</language>
    <tags>Ninject ioc di</tags>
    <iconUrl>https://github.com/ninject/ninject/raw/master/logos/Ninject-Logo32.png</iconUrl>
    <projectUrl>http://www.ninject.org</projectUrl>
  </metadata>
</package>

Pulling directly from Giles rakefile which uses Albacore to generate the XML spec:

desc "Create the nuspec"
nuspec :createSpec => :prepPackage do |nuspec|
    nuspec.id = "Giles"
    nuspec.version = @GilesVersion
    nuspec.authors = "Jeff Schumacher (@codereflection)"
    nuspec.owners = "Jeff Schumacher (@codereflection)"
    nuspec.description = "Giles - continuous test runner for .NET applications."
    nuspec.summary = "Currently supports Machine.Specifications (mspec), NUnit, xUnit.net, and NSpec"
    nuspec.language = "en-US"
    nuspec.projectUrl = "http://testergiles.herokuapp.com/"
    nuspec.title = "Giles, Rupert Giles, at your service!"
    nuspec.tags = "testrunner test unittest giles"
    nuspec.output_file = "Giles.nuspec"
    nuspec.working_directory = "deploy/package"
    nuspec.licenseUrl = "https://github.com/codereflection/Giles/blob/master/License.txt"
end

desc "Create the nuspec package"
nugetpack :createPackage do |nugetpack|
    nugetpack.nuspec = "deploy/package/Giles.nuspec"
    nugetpack.base_folder = "deploy/package"
    nugetpack.output = "deploy"
end

We can replicate this with a simple PowerShell script where we give the user a simple object to which properties are attached.

param($spec)
$spec.id = 'Ninject'
$spec.version = '3.1.0.8'
$spec.authors = @('Ninject Project Contributors')
$spec.requireLicenseAcceptance = $true
$spec.licenseUrl = 'https://github.com/ninject/ninject/raw/master/LICENSE.txt'
$spec.summary = 'IoC container for .NET'
$spec.language = 'en-US'
$spec.tags = @('Ninject', 'ioc', 'di')
$spec.iconUrl = 'https://github.com/ninject/ninject/raw/master/logos/Ninject-Logo32.png'
$spec.projectUrl = 'http://www.ninject.org'

There are additional properties such as dependencies, frameworkAssemblies, and files. With a scripted definition, these nodes are very simple, and we can actually use file sets and collections to specify values instead of wildcards and long lists.

In the Beginning

Perhaps the craziest part is that NuGet started with Nubular (Nu), which was implemented via Ruby gems. Let that sink in for a moment. The package management system for .NET was once Ruby based. Here is an original package definition for Autofac:

version = File.read(File.expand_path("../VERSION", __FILE__)).strip
Gem::Specification.new do |spec|
  spec.platform  = Gem::Platform::RUBY
  spec.name      = 'autofac'
  spec.version   = version
  spec.files = Dir['lib/**/*'] + Dir['docs/**/*']
  spec.summary   = 'Autofac - An addictive .NET IoC container'
  spec.description = 'Autofac is an IoC container for Microsoft .NET. It manages the dependencies between classes so that applications stay easy to change as they grow in size and complexity. This is achieved by treating regular .NET classes as components.'
  spec.authors           = ['Nicholas Blumhardt','Rinat Abdulin']
  spec.email             = 'emailme@bilsimser.com'
  spec.homepage          = 'http://code.google.com/p/autofac/'
  spec.rubyforge_project = 'autofac'
end

If you go to the Nu website, you’ll see this at the top:

February 2011 - The collaborative effort of the Nubular team and Microsoft has produced NuGet.

I wasn’t involved, but it seems to me that collaborative is a synonym for neutering. They took a thriving ecosystem with a working solution and then stepped back years technologically. Microsoft couldn’t embrace Ruby, especially after the way it abandoned IronRuby. NuGet became the TFS of package management: highly integrated and doing nothing well.

Package Restore

I traveled to Portland and paired up with another developer to try to cut some code. We pulled down a repo, and then proceeded to trip all over ourselves…err, NuGet…for quite a while. The project we were dealing with, unsurprisingly, relied on NuGet package restore.

We were surprised at every step dealing with packages hooking into the csproj targets, build steps, and tooling incompatibilities. In the end, we changed frameworks and were finally able to rip out NuGet’s parasitic attachment to our project.

NuGet package restore is a feature in which NuGet embeds itself into your project files:

<Project>
  <PropertyGroup>
    <RestorePackages>true</RestorePackages>
  </PropertyGroup>
  <!-- your project contents -->
  <Import Project="$(SolutionDir)\.nuget\NuGet.targets" />
</Project>

It also creates a .nuget folder at your solution root which contains NuGet.config, NuGet.exe, and NuGet.targets.

When you pull down a project/solution that uses package restore and then try to build it, you get an error (well, most likely many errors, but they are all the same). NuGet, through its targets hook, will call:

"$(SolutionDir).nuget\nuget.exe" install "C:\<PathTo>\packages.config" -source ""
      -RequireConsent -o "$(SolutionDir)\packages"

Where <PathTo> is changed for every packages.config littered in your codebase.

C:\Dev\SomeProject\.nuget\NuGet.targets(80,9): error : Package restore is disabled by default.
To give consent, open the Visual Studio Options dialog, click on Package Manager node and
check 'Allow NuGet to download missing packages during build.' You can also give consent by
setting the environment variable 'EnableNuGetPackageRestore' to 'true'. [C:\Dev\SomeProject\src\SomeProject\SomeProject.csproj]
C:\Dev\SomeProject\.nuget\NuGet.targets(80,9): error : One or more errors occurred. [C:\Dev\SomeProject\src\SomeProject\SomeProject.csproj]

So, now I need to either open Visual Studio and find the setting they list, or create an environment variable and set its value to true. Now we have to configure package restore on each developer’s machine, plus the build server, to deal with this requirement.

There is something more insidious hiding inside your build due to package restore. When you run your build, it is not a one-time install of your application’s dependencies. NuGet will run a package install for every single package in every project that has a packages.config file. Let’s look at the log output from building a solution with package restore:

> msbuild SomeProject.sln
...
RestorePackages:
  "C:\dev\SomeProject\.nuget\nuget.exe" install "C:\dev\SomeProject\src\SomeProject\packages.config
  " -source ""  -RequireConsent -solutionDir "C:\dev\SomeProject\ "
  All packages listed in packages.config are already installed.

The build will run the nuget.exe install call on every build. Every build you have is now slower, and it will be slower still for each project you add that makes this redundant call (NuGet will detect that the package is already installed, but it still slows your build and fills your logs with garbage).

NuGet is also dealing with privacy issues related to package restore. Had the NuGet team stuck with Nu’s original implementation, there would never have been a need for package restore; Bundler would have handled everything. Ruby’s gem system isn’t the greatest in the world, and a growing number of people are talking about trying something else, but it is still years ahead of NuGet.

NuGet.exe

NuGet on the command line is a shadow of its Package Manager Console self. NuGet.exe:

  • Can’t update a package, let alone all, without being given a packages.config file. (I hear this is being fixed)
  • Can’t remove a package, at all.
  • Does not run init.ps1, install.ps1, or uninstall.ps1 leaving your packages broken when used outside of Visual Studio. You don’t have a DTE instance, so even if the scripts were executed, they wouldn’t work. Package installation has been designed to force a user into Visual Studio to install a dependency.
  • With the config parameter: gets or sets NuGet config values - except that you don’t know what keys exist so you can’t see the config values.
  • When using packages.config will get the exact version in the file no matter what.
  • Doesn’t get dependencies at all when using packages.config; they are expected to be in the file already.

You can’t specify a compatible set of versions in your packages.config files should one or more of your dependencies require a different, but still compatible version of a shared dependency. I can hear you objecting with “you can use allowedVersions”:

<package id="log4net" version="1.2.10" allowedVersions="[1.2.10]" />

This, however, doesn’t work; it is broken by design. Using allowedVersions means you’d have to track down every packages.config file in your codebase and manually edit it to add this attribute. Even worse, NuGet looks at each packages.config individually, so even with allowedVersions it can’t see version compatibility issues across projects.

NuGet appends the version of the library to the name when creating a folder in the Packages folder. This means that any updates require you to update every one of your project files with a new file path. This is a weak argument, but we should be given an easy way to override this without manually creating the calls to nuget.exe or install-package.

One item that may be the coup de grâce of NuGet’s feature failings relates to how package sources are specified. How do you know where a particular dependency is sourced? We can see an example with xUnit. If we open xUnit’s NuGet.targets file:

<ItemGroup Condition=" '$(PackageSources)' == '' ">
  <PackageSource Include="https://nuget.org/api/v2/" />
  <PackageSource Include="http://www.myget.org/F/b4ff5f68eccf4f6bbfed74f055f88d8f/" />
</ItemGroup>

When you look at a dependency, you have no idea where that dependency is coming from. NuGet will have to check each feed to find where a particular package resides. What happens if the package is in both feeds? What if you want the version from your own feed to be used? This is just another example of making the tool work like magic and finding yet another place to configure NuGet.

Now, to figure out what dependencies I have and where they come from, I have to look through three different types of XML files, and even then I may not know which feed each dependency actually comes from.

The End?

Will I stop using NuGet? No. Will I be happy using it? No. Will I stop complaining? Hell no. Pretending that everything is great will never push us to do better. I’ll do everything I can to remove NuGet’s way of doing things and put in my own.

Finishing Questions

  • Do we host a new gems website and write an importer to convert NuGet packages to gems?
  • Will we write something gemlike?
  • Can we analyze gems’ and NuGet’s failings to design a better system?
  • Can we create tooling around a cross platform DSL other than XML?
  • The use of PowerShell and reliance on Visual Studio really hurts mono development on Linux using nuget. What could we use instead? Can we push Microsoft to make PowerShell OSS?
  • Can Nemerle be leveraged to create a DSL that will work for mono as well?
  • Do we rewrite NuGet the correct way? Correct being a first-class, OS-independent CLI and no XML. The CLI is the most important thing in most camps, and it should be here too, but it obviously isn’t since MS took over NuGet. There is a whole camp that thinks the GUI is the most important thing, and then there is MS, which tries to make all free things tie into paid products somehow. They are wrong. I understand their perspective, but it is the wrong choice. When developing free software, choose to do it right. By creating deep integration into Visual Studio and abandoning the command line, we are forced to use Visual Studio in order to leverage NuGet, thus destroying any competition.

Prototypal Inheritance in .NET: Delegation at Last

| Comments

In .NET 4.0, we have access to the Dynamic Language Runtime (DLR) giving us dynamic dispatch via the IDynamicMetaObjectProvider interface coupled with the dynamic keyword. When a variable is declared as dynamic and it implements the IDynamicMetaObjectProvider interface, we are given the opportunity to control the delegation of calls on that object by returning a DynamicMetaObject containing an expression which will be evaluated by the runtime. We only get this opportunity if the direct target was unable to directly handle the expression.

The DynamicObject and ExpandoObject classes both implement IDynamicMetaObjectProvider, but they implement it explicitly, and the nested classes used to return the DynamicMetaObject are private and sealed. I understand that the classes may not have been tested enough to make them inheritable, but not having access to them really hurts our ability to easily modify the behavior of the underlying DynamicMetaObject. The key word here is easily; implementing the needed expression building is a great deal of work, and the internal workings of the Microsoft implementations leverage many internal framework calls.

A key thing to consider is that we don’t want to replicate classical inheritance; instead, we are going to focus on prototypal inheritance that mostly replicates JavaScript’s prototypal inheritance. Rather than trying to replicate all of the hard work that the DLR team put into writing their implementation, we can add our own implementation on top of theirs. It is simple to hook in, but we need to save off a method to access the base DynamicMetaObject implementation. This will allow us to attempt to interpret the expression on the object itself or pass it along.

public class DelegatingPrototype : DynamicObject {
    public DelegatingPrototype( object prototype = null ) {
        Prototype = prototype;
    }

    public virtual object Prototype { get; set; }

    public override DynamicMetaObject GetMetaObject( Expression parameter ) {
        if ( Prototype == null ) {
            return GetBaseMetaObject( parameter );
        }
        return new PrototypalMetaObject( parameter, this, Prototype );
    }

    public virtual DynamicMetaObject GetBaseMetaObject( Expression parameter ) {
        return base.GetMetaObject( parameter );
    }
}

This small amount of code just sets the hook. Now we need to set up the delegation expression.

To set up the prototypal hierarchy, we are going to need to do a lot of recursion. Unfortunately, it is well hidden (I’ll explain shortly). When a call is made on the root object, we are given the expression being interpreted. Using the GetMetaObject override, we hand off the expression to the PrototypalMetaObject to construct the delegation expression (or to the DynamicObject implementation if our prototype is null, thus trying to interpret the expression on the current object). We want to make a preorder traversal of the prototype hierarchy; at each step, the current object’s evaluation will take precedence over its prototype tree. Consider the following prototypal class hierarchy:

Since we never know which level of the hierarchy will be handling the expression, we need to build an expression for the entire tree every time. We want to get the DynamicMetaObject representing the current object’s tree first. Once done, we get the DynamicMetaObject for evaluating the expression on the current instance. With these two, we can create a new DynamicMetaObject which tries to bind the expression to the current instance first and then falls back to the prototype. At the root level, the prototype DynamicMetaObject contains the same fallback for the next two layers.

There is another caveat that we need to address. When we try to invoke an expression on an object, the expression is bound to that type. When accessing the prototype, if we don’t do anything, the system will throw a binding exception because the prototype object won’t match the DynamicMetaObject’s type restrictions. To fix this, we need to relax the type restrictions for each prototype.

To iterate is human, to recurse is divine, to inception recurse is demented

L. Peter Deutsch Paraphrased

Remember the recursion I mentioned earlier? In the code sample below, I have pulled out all binding code except for the BindInvokeMember method. The _metaObject.Bind[...] call will actually call into DelegatingPrototype::GetMetaObject, which will call back into _metaObject.Bind[...], which will… well, you get the idea. At each call, the prototype becomes the target and we get a new prototype.

public class PrototypalMetaObject : DynamicMetaObject {
    private readonly DynamicMetaObject _baseMetaObject;
    private readonly DynamicMetaObject _metaObject;
    private readonly DelegatingPrototype _prototypalObject;
    private readonly object _prototype;

    public PrototypalMetaObject( Expression expression, DelegatingPrototype value, object prototype )
            : base( expression, BindingRestrictions.Empty, value ) {
        _prototypalObject = value;
        _prototype = prototype;
        _metaObject = CreatePrototypeMetaObject();
        _baseMetaObject = CreateBaseMetaObject();
    }

    protected virtual DynamicMetaObject CreateBaseMetaObject() {
        return _prototypalObject.GetBaseMetaObject( Expression );
    }
  
    protected virtual DynamicMetaObject CreatePrototypeMetaObject() {
        Expression castExpression = GetLimitedSelf();
        MemberExpression memberExpression = Expression.Property( castExpression, "Prototype" );
        return Create( _prototype, memberExpression );
    }

    protected Expression GetLimitedSelf() {
        return AreEquivalent( Expression.Type, LimitType ) ? Expression : Expression.Convert( Expression, LimitType );
    }

    protected bool AreEquivalent( Type lhs, Type rhs ) {
        return lhs == rhs || lhs.IsEquivalentTo( rhs );
    }
      
    protected virtual BindingRestrictions GetTypeRestriction() {
        if ( Value == null && HasValue ) {
            return BindingRestrictions.GetInstanceRestriction( Expression, null );
        }
        return BindingRestrictions.GetTypeRestriction( Expression, LimitType );
    }

    protected virtual DynamicMetaObject AddTypeRestrictions( DynamicMetaObject result ) {
        BindingRestrictions typeRestrictions = GetTypeRestriction().Merge( result.Restrictions );
        return new DynamicMetaObject( result.Expression, typeRestrictions, _metaObject.Value );
    }
  
    public override DynamicMetaObject BindInvokeMember( InvokeMemberBinder binder, DynamicMetaObject[] args ) {
        DynamicMetaObject errorSuggestion = AddTypeRestrictions( _metaObject.BindInvokeMember( binder, args ) );
        return binder.FallbackInvokeMember( _baseMetaObject, args, errorSuggestion );
    }
}

You may be thinking: ok, this is cool, but what use is it? What is the use case? First, it’s cool. Second, it sets the foundation for .NET mixins. Third, it gives us a second form of inheritance (after parasitic) for PowerShell.

What if we take the prototype and make it a collection of prototypes? What if instead of inheriting from DelegatingPrototype we reuse the internal prototypal skeleton? If this sounds familiar, it should. I am describing ruby classes with modules and a base class, but with C#…
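To make the lookup rule concrete without any DLR machinery, here is a rough conceptual sketch in Python (not C#, and not part of Archetype; Proto, Walker, and Swimmer are made-up names for illustration). The instance tries to respond itself first, then walks its attached prototypes last to first, just as described above:

```python
class Proto:
    """Minimal prototypal-delegation sketch: attribute lookup falls back
    to attached prototypes, most recently attached first."""

    def __init__(self, *prototypes):
        # Bypass __getattr__ machinery while setting up state.
        object.__setattr__(self, "prototypes", list(prototypes))

    def __getattr__(self, name):
        # __getattr__ only fires when the instance itself can't respond,
        # mirroring the DLR fallback. Search last-to-first so the most
        # recently mixed-in prototype wins.
        for proto in reversed(self.prototypes):
            try:
                return getattr(proto, name)
            except AttributeError:
                continue
        raise AttributeError(name)


class Walker:
    def move(self):
        return "walk"


class Swimmer:
    def move(self):
        return "swim"


duck = Proto(Walker(), Swimmer())
print(duck.move())  # "swim" -- Swimmer was attached last, so it wins
```

Swap the attachment order and `move` resolves to Walker instead; that is exactly the last-to-first rule ruby modules follow.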

If you want to see more or play around with the code, you can find full implementations in the Archetype project.

Project Euler: Problem 10 in PowerShell

| Comments

The sum of the primes below 10 is 2 + 3 + 5 + 7 = 17.

Find the sum of all the primes below two million.

This problem is very simple as I can reuse the parasitic prime generation from my solution to Problem 7 which also requires Prototype.ps.

function New-PrimeFinder {
  $prototype = New-PrimeGenerator
  $prototype | Add-Function FindPrimesLessThan {
    param($value)
    if($this.Bound -lt $value) {
      $this.BoundIncrement = $value - $this.Bound
      $this.Expand()
    }
    return $this.Primes | ? { $_ -lt $value }
  }
  $prototype
}

function Solve-Problem10 {
  param($value = 2000000)
  $finder = New-PrimeFinder
  $finder.FindPrimesLessThan($value) | % {[long]$sum = 0} {$sum+=$_} {$sum}
}

Write-Host "Elapsed Time (s): " (Measure-Command {Solve-Problem10}).TotalSeconds
Write-Host "Solution: " (Solve-Problem10)

Elapsed Time (s):  1999.6739304
Solution:  142913828922
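The prototypal version is fun, but ~2000 seconds is painful. For comparison, here is a plain sieve-of-Eratosthenes sketch (in Python rather than PowerShell, purely for illustration) that computes the same sum:

```python
def sum_primes_below(limit):
    # Sieve of Eratosthenes: sieve[i] stays True iff i is prime.
    sieve = [True] * limit
    sieve[0:2] = [False, False]
    for i in range(2, int(limit ** 0.5) + 1):
        if sieve[i]:
            # Knock out every multiple of i starting at i*i.
            sieve[i * i::i] = [False] * len(range(i * i, limit, i))
    return sum(i for i, is_prime in enumerate(sieve) if is_prime)


print(sum_primes_below(2_000_000))  # 142913828922
```

Same answer as above, in a fraction of the time; the cost here is the delegation-heavy prototype, not the algorithm.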

Project Euler: Problem 9 in PowerShell

| Comments

A Pythagorean triplet is a set of three natural numbers, a < b < c, for which,
a^2 + b^2 = c^2
For example, 3^2 + 4^2 = 9 + 16 = 25 = 5^2.

There exists exactly one Pythagorean triplet for which a + b + c = 1000.
Find the product abc.

function Solve-Problem9 {
  $sum = 1000
  $found = $false
  for ($a = 1; $a -lt $sum / 3; $a++) {
    for ($b = $a; $b -lt $sum / 2; $b++) {
      $c = $sum - $a - $b

      if ($a * $a + $b * $b -eq $c * $c) {
        $found = $true
        break
      }
    }

    if ($found) { break }
  }

  Write-Host "The Pythagorean triplet is a = $a, b = $b, c = $c"
  Write-Host "The product is $($a*$b*$c)"
}

Write-Host "Elapsed Time (s): " (Measure-Command {Solve-Problem9}).TotalSeconds
The Pythagorean triplet is a = 200, b = 375, c = 425
The product is 31875000
Elapsed Time (s):  0.2247551
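The same brute force translates almost line for line into other languages. A Python sketch, with c forced to total - a - b so only two loops are needed, and b starting at a + 1 since the problem requires a < b < c:

```python
def pythagorean_triplet(total):
    # For each a < b, c is forced; just check the triplet condition.
    for a in range(1, total // 3):
        for b in range(a + 1, (total - a) // 2 + 1):
            c = total - a - b
            if a * a + b * b == c * c:
                return a, b, c
    return None


a, b, c = pythagorean_triplet(1000)
print(a * b * c)  # 31875000
```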

Project Euler: Problem 8 in PowerShell

| Comments

Find the greatest product of five consecutive digits in the 1000-digit number.

73167176531330624919225119674426574742355349194934
96983520312774506326239578318016984801869478851843
85861560789112949495459501737958331952853208805511
12540698747158523863050715693290963295227443043557
66896648950445244523161731856403098711121722383113
62229893423380308135336276614282806444486645238749
30358907296290491560440772390713810515859307960866
70172427121883998797908792274921901699720888093776
65727333001053367881220235421809751254540594752243
52584907711670556013604839586446706324415722155397
53697817977846174064955149290862569321978468622482
83972241375657056057490261407972968652414535100474
82166370484403199890008895243450658541227588666881
16427171479924442928230863465674813919123162824586
17866458359124566529476545682848912883142607690042
24219022671055626321111109370544217506941658960408
07198403850962455444362981230987879927244284909188
84580156166097919133875499200524063689912560717606
05886116467109405077541002256983155200055935729725
71636269561882670428252483600823257530420752963450

function Split-Number {
  param([string]$value)
  $index = 0
  while($index -lt $value.Length - 4) {
    $value.Substring($index++, 5)
  }
}

filter Evaluate-String {
  $_.ToCharArray() | % { [int]::Parse($_) } | % {$total = 1} {$total *= $_} {$total}
}

function Solve-Problem8 {
  $text = "7316717653133062491922511967442657474235534919493496983520312774506326239578318016984801869478851843858615607891129494954595017379583319528532088055111254069874715852386305071569329096329522744304355766896648950445244523161731856403098711121722383113622298934233803081353362766142828064444866452387493035890729629049156044077239071381051585930796086670172427121883998797908792274921901699720888093776657273330010533678812202354218097512545405947522435258490771167055601360483958644670632441572215539753697817977846174064955149290862569321978468622482839722413756570560574902614079729686524145351004748216637048440319989000889524345065854122758866688116427171479924442928230863465674813919123162824586178664583591245665294765456828489128831426076900422421902267105562632111110937054421750694165896040807198403850962455444362981230987879927244284909188845801561660979191338754992005240636899125607176060588611646710940507754100225698315520005593572972571636269561882670428252483600823257530420752963450"

  (Split-Number $text | Evaluate-String | Measure-Object -Maximum).Maximum
}

Write-Host "Elapsed Time (s): " (Measure-Command {Solve-Problem8}).TotalSeconds
Write-Host "Solution: " (Solve-Problem8)

Elapsed Time (s):  0.5708985
Solution:  40824
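The core of the solution is a sliding window of five digits multiplied together. Here is a hedged Python sketch of that window, demonstrated on a short made-up number rather than re-pasting the full 1000-digit string (on the real input it yields the 40824 shown above):

```python
def greatest_product(digits, window=5):
    """Greatest product of `window` consecutive digits in a digit string."""
    best = 0
    for i in range(len(digits) - window + 1):
        product = 1
        for ch in digits[i:i + window]:
            product *= int(ch)
        best = max(best, product)
    return best


print(greatest_product("123456"))  # 720, from the window "23456"
```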

Project Euler: Problem 7 in Parasitic PowerShell

| Comments

This post leverages my OSS project Prototype.ps to support parasitic object creation. If you haven’t read the introduction posts Part 1 and Part 2, I would recommend reading them as I build off of their functionality and theory. If you don’t care how it works, read on my friend.

By listing the first six prime numbers: 2, 3, 5, 7, 11, and 13, we can see that the 6th prime is 13.

What is the 10,001st prime number?

Solving Using a Prototypal Object
function New-PrimeGenerator {
  $prototype = New-Prototype
  $prototype | Update-TypeName
  $prototype | Add-Property Primes @(2,3,5)
  $prototype | Add-Property Cache @{2=$true; 3=$true; 4=$false; 5=$true}
  $prototype | Add-Property Bound 5
  $prototype | Add-Property BoundIncrement 1000
  $prototype | Add-Property LastExpansionPrimeCount 3
  $prototype | Add-ScriptProperty MaxPrime {$this.Primes | select -last 1}
  $prototype | Add-Function IsPrime {
    param($value)
    while ($this.MaxPrime -lt $value) { $this.Expand() }
    return $this.Cache[$value] -ne $null
  }
  $prototype | Add-Function FindNthPrime {
    param($value)
    while ($this.Primes.Length -lt $value) { $this.Expand() }
    return $this.Primes[$value-1]
  }
  $prototype | Add-Function RoundToNextMultiple {
    param($base, $multiple)
    $remainder = $base % $multiple;
    if($remainder -eq 0) { $base }
    else { $base + $multiple - $remainder }
  }
  $prototype | Add-Function Expand {
    $oldBound = $this.Bound
    if($this.LastExpansionPrimeCount -le $this.Primes.Length) {
      $this.Bound += $this.BoundIncrement
    }
    $limit = $this.Bound
    $cache = $this.Cache
    $this.Primes | % {
      for ($i=$this.RoundToNextMultiple($oldBound,$_); $i -le $limit; $i += $_ ) {
        if(!$cache.ContainsKey($i)) { $cache[$i] = $false }
      }
    }
    ($oldBound+1)..$limit | ? { $cache[$_] -eq $null } | % {
      $cache[$_] = $true
      $this.Primes += @($_)
      for ($i=$this.RoundToNextMultiple($oldBound,$_); $i -le $limit; $i += $_ ) {
        if(!$cache.ContainsKey($i)) { $cache[$i] = $false }
      }
    }
    $this.LastExpansionPrimeCount = $this.Primes.Length
  }
  $prototype
}

function Solve-Problem7 {
  $generator = New-PrimeGenerator
  Write-Host "Elapsed Time (s): " (Measure-Command {$generator.FindNthPrime(10001)}).TotalSeconds
  Write-Host "Elapsed Time (s): " (Measure-Command {$generator.FindNthPrime(10001)}).TotalSeconds
  Write-Host "Solution: " ($generator.FindNthPrime(10001))
}

Solve-Problem7

Elapsed Time (s):  33.1381042
Elapsed Time (s):   0.0003987
Solution:  104743
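As a rough cross-check of the generator, here is a Python sketch that sizes a single sieve using the standard upper bound n(ln n + ln ln n) for the nth prime (the 1.2 safety factor is my own fudge, not something from the post, and the bound formula assumes n is at least 6):

```python
import math


def nth_prime(n):
    # Upper bound on the nth prime, padded a little to be safe.
    bound = max(15, int(n * (math.log(n) + math.log(math.log(n))) * 1.2))
    sieve = [True] * bound
    sieve[0:2] = [False, False]
    for i in range(2, int(bound ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i::i] = [False] * len(range(i * i, bound, i))
    primes = [i for i, is_prime in enumerate(sieve) if is_prime]
    return primes[n - 1]


print(nth_prime(10001))  # 104743
```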

Project Euler: Problem 6 in PowerShell

| Comments

The sum of the squares of the first ten natural numbers is,

1^2 + 2^2 + … + 10^2 = 385
The square of the sum of the first ten natural numbers is,

(1 + 2 + … + 10)^2 = 55^2 = 3025
Hence the difference between the sum of the squares of the first ten natural numbers and the square of the sum is 3025-385 = 2640.

Find the difference between the sum of the squares of the first one hundred natural numbers and the square of the sum.

function Solve-Problem6 {
  $sum = (1..100 | Measure-Object -Sum).Sum
  $sumOfSquares = (1..100 | % {$_*$_} | Measure-Object -Sum).Sum
  $sum * $sum - $sumOfSquares
}

Write-Host "Elapsed Time (s): " (Measure-Command {Solve-Problem6}).TotalSeconds
Write-Host "Solution: " (Solve-Problem6)

Elapsed Time (s):  0.0115265
Solution:  25164150
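The pipeline above can be checked against the closed forms: the sum of 1..n is n(n+1)/2 and the sum of the squares is n(n+1)(2n+1)/6, so no loop is actually needed. A quick Python check:

```python
def difference(n):
    # Square of the sum minus the sum of the squares, via closed forms.
    square_of_sum = (n * (n + 1) // 2) ** 2
    sum_of_squares = n * (n + 1) * (2 * n + 1) // 6
    return square_of_sum - sum_of_squares


print(difference(100))  # 25164150
```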

Project Euler: Problem 5 in PowerShell

| Comments

2520 is the smallest number that can be divided by each of the numbers from 1 to 10 without any remainder.

What is the smallest positive number that is evenly divisible by all of the numbers from 1 to 20?

This problem is just a fold: take two values from the array, replace them with their LCM, and repeat until only one item is left in the collection (which is then the LCM of all entries).

function Get-Gcd {
  param($lhs, $rhs)
  if ($lhs -eq $rhs) { return $rhs }
  if ($lhs -gt $rhs) { $a,$b = $lhs,$rhs }
  else { $a,$b = $rhs,$lhs }
  while ($a % $b -ne 0) {
    $tmp = $a % $b
    $a,$b = $b,$tmp
  }
  return $b
}

function Get-Lcm {
  param($lhs, $rhs)
  [long][Math]::Abs($lhs * $rhs) / (Get-Gcd $lhs $rhs)
}

function Get-LcmOfGroup {
  param([int[]]$values)
  $lhs = $values[0]
  $rhs = $values[1]
  $lcm = Get-Lcm $lhs $rhs
  $values = @($lcm) + $values[2..$values.Length]
  if($values.Length -eq 1) { return $lcm }
  else { Get-LcmOfGroup $values }
}

function Solve-Problem5 {
  Get-LcmOfGroup @(1..20)
}

Write-Host "Elapsed Time (s): " (Measure-Command {Solve-Problem5}).TotalSeconds
Write-Host "Solution: " (Solve-Problem5)

Elapsed Time (s):  0.0407472
Solution:  232792560
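The fold described above is literally a reduce over lcm, where lcm(a, b) = |a*b| / gcd(a, b). For comparison, a Python version using the standard library:

```python
from functools import reduce
from math import gcd


def lcm(a, b):
    # LCM via GCD, same identity the PowerShell version uses.
    return a * b // gcd(a, b)


print(reduce(lcm, range(1, 21)))  # 232792560
```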