Friday, August 18, 2017

Entity Framework Core 2.0 and LINQ2DB Performance

We will look at EF Core 2.0 performance compared to LINQ2DB - currently the fastest ORM - and also to Entity Framework 6.

Hardware used: i5-4200H, DDR3-1600, Win 10 x64 1607.

Software used: SQL Server 2016 SP1, VS 2017, .NET 4.6.1, EF 6.1.3, LINQ to DB 1.8.3, EF Core 2.0.

Northwind database will be used.

You can find the SQL queries and testing methods in one of the previous articles.

Simple TOP 10 query

Here and below, the gray part of each bar is context initialization.

EF Core still has a big overhead compared to ADO.NET and LINQ2DB on simple queries. In my opinion, the performance impact can range from 50-70% for simple systems down to 5-10% for enterprise systems (again, for simple queries).

You can also see that EF Core can't execute raw SQL queries fast. Moreover, EF Core can't return a custom result - you are limited to entities.
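For reference, raw SQL in EF Core 2.0 goes through `FromSql`, which can only materialize mapped entity types. A hedged sketch, assuming a Northwind-style `NorthwindContext` with an `Orders` DbSet (the names are mine):

```csharp
using System.Linq;
using Microsoft.EntityFrameworkCore;

// Sketch only - NorthwindContext is an assumed EF Core context.
class RawSqlExample
{
    public static void Run()
    {
        using (var ctx = new NorthwindContext())
        {
            // FromSql can only return full entities; there is no EF 6-style
            // SqlQuery<TDto> for projecting a custom result shape.
            var orders = ctx.Orders
                .FromSql("SELECT * FROM Orders WHERE ShipCountry = {0}", "USA")
                .ToList(); // List<Order> - full entities only
        }
    }
}
```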

Compiled EF Core queries are a bit faster (but still far from ADO.NET or LINQ2DB).
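EF Core 2.0's compiled queries are created through `EF.CompileQuery`. A minimal sketch, again assuming hypothetical Northwind-style `NorthwindContext` and `Order` types:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.EntityFrameworkCore;

// Sketch only - NorthwindContext and Order are assumed Northwind-style types.
class CompiledQueryExample
{
    // The LINQ expression is translated to SQL once, when the delegate is
    // built, instead of being processed on every execution.
    static readonly Func<NorthwindContext, int, IEnumerable<Order>> TopOrders =
        EF.CompileQuery((NorthwindContext ctx, int count) =>
            ctx.Orders.OrderBy(o => o.OrderDate).Take(count));

    public static void Run()
    {
        using (var ctx = new NorthwindContext())
        {
            var orders = TopOrders(ctx, 10).ToList();
        }
    }
}
```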

Simple TOP 500 query

Performance of EF Core queries (both LINQ and raw SQL) is close to Entity Framework 6. ADO.NET and LINQ2DB are a little faster.

EF Core raw SQL queries are dramatically slow, so you won't see them in the charts below.

Complex TOP 10 query

For complex queries EF Core looks pretty good and is comparable to ADO.NET and LINQ2DB. Note that ADO.NET and LINQ2DB raw SQL queries are a bit faster.

Complex TOP 500 query

For complex queries with many rows, both LINQ and compiled EF Core queries perform on par with ADO.NET and LINQ2DB.

Conclusions

EF Core 2.0 still can't be used for raw SQL queries - both because of speed and because of the impossibility of returning custom (non-entity) results.

Another bad thing is that EF Core is slower for simple queries.

The one good thing about EF Core is that it is nearly as fast as ADO.NET and LINQ2DB for complex queries (though still a bit slower).

Raw results (Excel).

View project source code at Bitbucket.

Friday, February 24, 2017

Reflection vs Compiled Expression Performance

This post compares the performance of reflection and compiled expressions.

There's a nice library, ObjectListView, which has lots of features and is also easy to use, because there's no need to fill ListViewItem manually.

For User class:

class User
{
    public int Id;
    public string Name;
    public DateTime BirthDate;
}

instead of this code:

var lvis = new List<ListViewItem>();
foreach (var user in users)
{
    lvis.Add(new ListViewItem(new[]
    {
        user.Id.ToString(),
        user.Name,
        user.BirthDate.ToString(),
    }));
}

you can simply pass collection of your own classes:

objectListView.Objects = users;

This library is an example of where reflection can be used.

But what should be used: reflection, compiled expressions, or emit? The following tests will show - except for emit, which won't be tested because it's difficult to use. An estimate for emit can be made by looking at the manual test (for speed) and the compiled expression test (for startup overhead).

Three tests will be made:

  1. Manual.
  2. Reflection.
  3. Compiled expression.

Each test consists of 200 iterations for warmup and 200 iterations for the test itself.

Every test creates a list of ListViewItem for the specified object type, except the manual test, which works only with the User type.

Hardware: i5-4200H, DDR3-1600, Win 10 x64 1607. Software: VS 2015, .NET 4.6.1.

Code for manual test:

public static List<ListViewItem> CreateListItemsManual(List<User> users)
{
    var items = new List<ListViewItem>();
    foreach (var user in users)
    {
        var subitems = new[]
        {
            user.Id.ToString(),
            user.Name,
            user.BirthDate.ToString("dd.MM.yyyy (ddd)"),
        };
        var lvi = new ListViewItem(subitems);
        items.Add(lvi);
    }
    return items;
}

Code for reflection test:

public static List<ListViewItem> CreateListItemsReflection(Type type, IEnumerable<object> users)
{
    var items = new List<ListViewItem>();
    var fields = type.GetFields();
    foreach (var user in users)
    {
        var subitems = new string[fields.Length];
        for (int i = 0; i < fields.Length; i++)
        {
            string value;
            var field = fields[i];
            if (field.FieldType == typeof(string))
            {
                value = (string)field.GetValue(user);
            }
            else if (field.FieldType == typeof(int))
            {
                value = ((int)field.GetValue(user)).ToString();
            }
            else if (field.FieldType == typeof(DateTime))
            {
                value = ((DateTime)field.GetValue(user)).ToString("dd.MM.yyyy (ddd)");
            }
            else
            {
                value = field.GetValue(user).ToString();
            }
            subitems[i] = value;
        }
        var lvi = new ListViewItem(subitems);
        items.Add(lvi);
    }
    return items;
}

Code for compiled expression test:

public static List<ListViewItem> CreateListItemsCompiledExpression(Type type, IEnumerable<object> users)
{
    var items = new List<ListViewItem>();
    var fields = type.GetFields();
    Func<object, string>[] fieldGetters = new Func<object, string>[fields.Length];
    for (int i = 0; i < fields.Length; i++)
    {
        Func<object, string> fieldGetter;
        Expression<Func<object, string>> lambda;
        var field = fields[i];
        // user => 
        var userObject = Expression.Parameter(typeof(object), "user");
        // user => (User)user
        var user = Expression.Convert(userObject, type);
        // user => ((User)user)."Field"
        var fld = Expression.Field(user, field);
        if (field.FieldType == typeof(string))
        {
            // user => ((User)user)."Field"
            lambda = Expression.Lambda<Func<object, string>>(fld, userObject);
        }
        else if (field.FieldType == typeof(int))
        {
            // user => ((User)user)."Field".ToString() // int.ToString()
            var toString = Expression.Call(fld, typeof(int).GetMethod("ToString", new Type[0]));
            lambda = Expression.Lambda<Func<object, string>>(toString, userObject);
        }
        else if (field.FieldType == typeof(DateTime))
        {
            // user => ((User)user)."Field".ToString("dd.MM.yyyy (ddd)")
            var toString = Expression.Call(
                fld,
                typeof(DateTime).GetMethod("ToString", new Type[] { typeof(string) }),
                Expression.Constant("dd.MM.yyyy (ddd)"));
            lambda = Expression.Lambda<Func<object, string>>(toString, userObject);
        }
        else
        {
            // user => ((User)user)."Field".ToString() // object.ToString()
            var toString = Expression.Call(fld, typeof(object).GetMethod("ToString", new Type[0]));
            lambda = Expression.Lambda<Func<object, string>>(toString, userObject);
        }
        fieldGetter = lambda.Compile();
        fieldGetters[i] = fieldGetter;
    }
    foreach (var user in users)
    {
        var subitems = new string[fields.Length];
        for (int i = 0; i < fields.Length; i++)
        {
            subitems[i] = fieldGetters[i](user);
        }
        var lvi = new ListViewItem(subitems);
        items.Add(lvi);
    }
    return items;
}

Results

There's not much difference in absolute time when there are few items - ~0.5 ms. This time is the startup overhead of expression compilation. It doesn't matter for UI - nobody can notice a 0.5 ms difference.

Let's see the whole graph below.

Reflection is slower by about 6-7 ms for 20,000 elements. Again, this is not a difference anyone can notice in UI.

But what should be used in real-life projects? Is it worth writing universal, simple code using reflection/expressions, or is it better to spend time writing specific code for every type manually to achieve the best performance for both few and many elements?

For UI components, if it's definitely known that there won't be many elements, reflection can be used.

But what if it's a server application, and/or there can be cases with both few and many elements, and/or performance is required? Already at 100-200 elements the first graph shows a ~1.5x performance difference between the manual and reflection methods.

Fortunately, in real applications the types used don't change while the program runs. This means that once expressions are compiled, they can be cached.

This approach allows using compiled expressions without the startup overhead.
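As an illustration, such a cache can be sketched like this (the `FieldGetterCache` and `SampleUser` names are mine; for brevity every field is rendered with plain `ToString()` instead of the per-type formatting used in the tests above):

```csharp
using System;
using System.Collections.Concurrent;
using System.Linq;
using System.Linq.Expressions;

public static class FieldGetterCache
{
    // One compiled getter array per type; built on first request, reused afterwards.
    static readonly ConcurrentDictionary<Type, Func<object, string>[]> Cache =
        new ConcurrentDictionary<Type, Func<object, string>[]>();

    public static Func<object, string>[] GetGetters(Type type)
    {
        return Cache.GetOrAdd(type, t => t.GetFields()
            .Select(f =>
            {
                // obj => ((T)obj).Field.ToString()
                var obj = Expression.Parameter(typeof(object), "obj");
                var fld = Expression.Field(Expression.Convert(obj, t), f);
                var toString = Expression.Call(
                    Expression.Convert(fld, typeof(object)), // box value types
                    typeof(object).GetMethod("ToString"));
                return Expression.Lambda<Func<object, string>>(toString, obj).Compile();
            })
            .ToArray());
    }
}

public class SampleUser
{
    public int Id;
    public string Name;
}
```

Only the first `GetGetters` call for a type pays the compilation cost; all subsequent calls return the cached delegates.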

Script with raw (200 iterations) results (R).

View project source code at Bitbucket.

Saturday, February 11, 2017

EF Core vs LINQ2DB

Entity Framework Core recently reached v1.1.0. Though it still lacks some critical features like "GROUP BY" SQL translation (see its roadmap), it's time to test it.

The following frameworks will be tested:

  1. Entity Framework CodeFirst (LINQ query, models generated from DB)
  2. Entity Framework (raw SQL query)
  3. ADO.NET
  4. LINQ to DB (LINQ query, model entities generated from DB)
  5. LINQ to DB (raw SQL query)
  6. Entity Framework Core (doesn't support raw SQL execution at this moment)

Hardware used: i5-4200H, DDR3-1600, Win 10 x64 1607.

Software used: SQL Server 2016 SP1, VS 2015, .NET 4.6.1, EF 6.1.3, LINQ to DB 1.7.5, EF Core 1.1.0.

And default Northwind database.

The tests are the same as in one of the previous articles.

Note: EF Core doesn't use "GROUP BY" in the generated SQL; instead it performs the grouping in memory. This can lead to high load on the database in production.
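To make the pitfall concrete, here is a hedged sketch (assuming a Northwind-style `NorthwindContext`, a name of my own) of a query that looks server-side but isn't in EF Core 1.1:

```csharp
using System.Linq;

// Sketch only - NorthwindContext is an assumed EF Core 1.1 context.
class GroupByExample
{
    public static void Run()
    {
        using (var ctx = new NorthwindContext())
        {
            // Looks like it should become "GROUP BY od.ProductID ... SUM(Quantity)",
            // but EF Core 1.1 fetches all Order Details rows from the database
            // and then groups them in memory.
            var totals = ctx.OrderDetails
                .GroupBy(od => od.ProductID)
                .Select(g => new { ProductID = g.Key, Total = g.Sum(od => od.Quantity) })
                .ToList();
        }
    }
}
```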

Context Initialization

EF Core's context initialization is twice as fast as EF 6's. This matters for simple, fast queries.

Simple TOP 10 query

Here and below, the grey part of each bar is context initialization.

We can see that EF Core is faster than EF 6 when running simple queries. Though it beats EF 6 both in context initialization and in everything else, overall it is still twice as slow as LINQ2DB.

Depending on the usage it might not be so bad, because absolute time is low.

Simple TOP 500 query

Results are almost the same, but now EF Core is not too far from ADO.NET and LINQ2DB.

Complex TOP 10 query

There is almost no difference between the frameworks, except EF 6, which is 2x slower than the others.

Complex TOP 500 query

The complex query with many result rows makes all frameworks nearly equal (again except EF 6, which is 2x slower than the others).

Conclusions

EF Core is faster than EF 6, which is a good thing. But it still can't use the "GROUP BY" clause in SQL, even though version 1.1.0 has been released. That's bad.

Another bad thing about EF Core is that it doesn't support raw SQL execution. That hardly matters for complex queries, but applications usually have many simple queries, and here EF Core is weak - it can't be optimized further. Change tracking doesn't affect selects, so the only remaining optimization is raw SQL.

So, if performance is not significant, EF Core can be chosen. Otherwise, even EF 6 might be preferable, because it supports raw SQL execution, which helps with heavy queries.
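The EF 6 escape hatch in question is `Database.SqlQuery<T>`, which can materialize raw SQL into an arbitrary (non-entity) type. A sketch, where `NorthwindContext` is an assumed EF 6 DbContext and `OrderInfo` is a hypothetical DTO of mine:

```csharp
using System.Linq;

// Hypothetical DTO - not part of the EF model.
public class OrderInfo
{
    public int OrderID { get; set; }
    public string CompanyName { get; set; }
}

class RawSqlEf6Example
{
    public static void Run()
    {
        using (var ctx = new NorthwindContext())
        {
            // SqlQuery<T> bypasses the LINQ pipeline entirely and maps
            // result columns to the DTO's properties by name.
            var infos = ctx.Database.SqlQuery<OrderInfo>(
                @"SELECT TOP 10 O.OrderID, C.CompanyName
                  FROM Orders O JOIN Customers C ON O.CustomerID = C.CustomerID")
                .ToList();
        }
    }
}
```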

And if performance is important, or change tracking is not required, then LINQ2DB may be the best choice. LINQ2DB's LINQ queries are not much slower than raw ADO.NET even for simple queries, and if that's not enough, raw SQL can be used. LINQ2DB is not new, so it doesn't have as many bugs as EF Core does now.

Raw results (Excel).

View project source code at Bitbucket.

Sunday, March 22, 2015

Performance of LINQ to DB vs Entity Framework vs BLToolkit vs ADO.NET

In recent years BLToolkit has been developed slowly. The reason is that its author, Igor Tkachev, decided to write a new ORM - LINQ to DB.

He says it provides the fastest LINQ database access. It supports 12 database providers, including MSSQL, SQLite, and Postgres, as well as mass UPDATE and DELETE and Bulk Copy.
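The mass UPDATE and DELETE support looks roughly like this. A hedged sketch, assuming a Northwind-style `Order` entity mapped with linq2db and an assumed "Northwind" connection configuration name:

```csharp
using System;
using System.Linq;
using LinqToDB;
using LinqToDB.Data;

// Sketch only - Order and the "Northwind" configuration are assumptions.
class MassUpdateExample
{
    public static void Run()
    {
        using (var db = new DataConnection("Northwind"))
        {
            // Translated into a single UPDATE statement - no entities are loaded.
            db.GetTable<Order>()
                .Where(o => o.ShipCountry == "USA")
                .Set(o => o.Freight, o => o.Freight * 1.1m)
                .Update();

            // Likewise a single DELETE statement.
            db.GetTable<Order>()
                .Where(o => o.OrderDate < new DateTime(1997, 1, 1))
                .Delete();
        }
    }
}
```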

Also, LINQ to DB provides a mechanism similar to the EF Code First generator: a T4 template that generates the code structure from the database. All you need is to set the connection string and execute the T4 template.

BLToolkit also supports LINQ, but its main strength is not only speed but also mapping.

Let's compare it with other ORMs.

These tests were performed on i5-4200H, DDR3-1600, Win 8.1 x64, SQL Server 2014, VS 2013, .NET 4.5, EF 6.1.2, BLToolkit 4.2.0, LINQ to DB 1.0.7.1. The default Northwind database was used.

The tests are the same as in the previous article.

There were 6 methods tested to work with database:

  1. DbContext CodeFirst (LINQ query, models generated from DB)
  2. DbContext CodeFirst (raw SQL query)
  3. ADO.NET
  4. Business Logic Toolkit (raw SQL query)
  5. LINQ to DB (LINQ query, model entities generated from DB)
  6. LINQ to DB (raw SQL query)

Context Initialization

EF CodeFirst with a LINQ query takes much more time than the others. LINQ to DB with a LINQ query and CodeFirst with raw SQL take nearly the same time, 2-3 times more than the other raw SQL methods. But anyway, for complex queries it doesn't matter much.

Simple TOP 10 query

Here and below the grey part of bar is context initialization.

For the simple query, a LINQ to DB LINQ query takes twice as much time as raw SQL. But it's still much faster than EF, and even slightly faster than CodeFirst with raw SQL.

Simple TOP 500 query

When the number of result rows is not small, the difference is not big even for the simple query. That's because mapping takes a significant part of the time, while compilation of a simple LINQ query does not. We can also see that the new linq2db architecture is faster than BLToolkit: even a LINQ query with linq2db is faster than raw SQL with BLToolkit. LINQ to DB with a raw SQL query is the same speed as ADO.NET.

Complex TOP 10 query

LINQ to DB compilation is very fast. This makes its LINQ query speed almost the same as ADO.NET. EF CodeFirst with a LINQ query takes twice as much time as the others.

Complex TOP 500 query

Complex TOP 500 query results are the same.

Conclusions

LINQ to DB is very fast with both raw SQL and LINQ queries. For simple and small queries it's possible to use raw SQL instead of LINQ, but in absolute time it makes almost no difference.

LINQ to DB is a good ORM choice if you don't need change tracking, and if you don't need all of BLToolkit's mapping capabilities (linq2db supports type-to-type mapping).

Raw results (XLSX).

View project source code at Bitbucket.

Thursday, January 8, 2015

Entity Framework DbContext vs ObjectContext vs LINQ2SQL vs ADO.NET vs Business Logic Toolkit Performance

With Entity Framework Microsoft recommends using DbContext instead of ObjectContext. So let's compare their performance.

These tests were performed on i5-4200H, DDR3-1600, Win 8.1 x64, SQL Server 2014, VS 2013, .NET 4.5, EF 6.1.2. Default Northwind database was used.

Tests include two different queries (simple, complex) and two lengths (10, 500 rows). Simple query:

SELECT TOP 10 O.OrderID, O.OrderDate, C.Country, C.CompanyName
FROM Orders O
JOIN Customers C ON O.CustomerID = C.CustomerID

Complex query:

SELECT TOP 10 OD.Quantity, OD.UnitPrice, OD.Discount, O.ShipCountry, S.Country
FROM Orders O
JOIN [Order Details] OD ON O.OrderID = OD.OrderID
JOIN Products P ON OD.ProductID = P.ProductID
JOIN Categories Cat ON P.CategoryID = Cat.CategoryID
JOIN Suppliers S ON P.SupplierID = S.SupplierID
WHERE
    Cat.CategoryID IN (@categoryIds)
    AND S.SupplierID IN (@supplierIds)
ORDER BY OD.Discount DESC

There were 6 methods tested to work with database:

  1. DbContext CodeFirst (generated from DB)
  2. DbContext Designer (generated from DB)
  3. ObjectContext (generated from DB with EdmGen.exe)
  4. LINQ2SQL
  5. ADO.NET
  6. Business Logic Toolkit (raw SQL query)

Each method was tested with 1000 iterations (and 100 iterations to warm up).
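The measurement loop has roughly this shape (a simplified sketch; the `Bench` name is mine, and the real harness is in the linked sources):

```csharp
using System;
using System.Diagnostics;

public static class Bench
{
    // Runs the action `warmup` times untimed (to let the JIT compile everything),
    // then `iterations` timed runs, and returns the average time per iteration
    // in milliseconds.
    public static double MeasureMs(Action action, int warmup, int iterations)
    {
        for (int i = 0; i < warmup; i++) action();

        var sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++) action();
        sw.Stop();

        return sw.Elapsed.TotalMilliseconds / iterations;
    }
}
```

For these tests it would be called as `Bench.MeasureMs(RunQuery, 100, 1000)`.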

Context Initialization

Since context initialization can't be measured directly, it was measured in the following way. Let's say we executed a query:

using (var ctx = new MyContext())
{
    var list = ctx.Products.Where(r => r.Name.Length < 10).ToList();
}

then if we executed this query twice:

using (var ctx = new MyContext())
{
    var list = ctx.Products.Where(r => r.Name.Length < 10).ToList();
    var list2 = ctx.Products.Where(r => r.Name.Length < 10).ToList();
}

we get a system of linear equations, where q is the query execution time, ctx is the context initialization time, and x and y are the measured times of the two snippets:

q + ctx = x
2*q + ctx = y

and now it's easy to find context initialization time:

ctx = 2*x - y

Context initialization was measured using the "Simple TOP 10" query.

Context initialization time for DbContext CodeFirst and Designer is nearly the same, while ObjectContext requires twice as much time. ADO.NET and BLToolkit have nearly the same minimal time, three times lower than DbContext. LINQ2SQL takes half the time of DbContext.

But as you can see below, in absolute terms the context initialization time doesn't always matter much.

Simple TOP 10 query

For the simple query with a few rows, where the database request takes little time, EF query compilation takes almost all the time for DbContext and ObjectContext. LINQ2SQL takes twice as much time as EF because its mapping is slow (I'll explain why I think so below, in the "Complex TOP 10" test). BLToolkit takes slightly more time than ADO.NET. And I don't know why the precompiled ObjectContext query takes less time than ADO.NET :) (but remember, this time is without context initialization). DbContext doesn't support precompiled queries at all.

With context initialization:

Simple TOP 500 query

The simple TOP 500 query takes more time to fetch data from the database. This is why DbContext and ObjectContext take only half again as much time as ADO.NET, and a third more than BLToolkit and the precompiled ObjectContext query.

With context initialization:

Complex TOP 10 query

The complex TOP 10 query shows a similar situation: EF query compilation time is comparable to the database request time. This is why DbContext and ObjectContext take only twice as much time as ADO.NET and BLToolkit.

As you remember, in the "Simple TOP 10" test LINQ2SQL took more time than EF, while in this test it takes less. We can suppose that the query execution time, time(), consists of the following steps:

  1. Context initialization - ctx()
  2. Query compilation - comp()
  3. Request to database - db()
  4. Mapping result - map()

Below is a little math :), where "1" is the "Simple TOP 10" query and "2" is this "Complex TOP 10" query.

time(L1) = ctx(L) + comp(L1) + db(1) + map(L1)
time(L2) = ctx(L) + comp(L2) + db(2) + map(L2)
time(EF1) = ctx(EF) + comp(EF1) + db(1) + map(EF1)
time(EF2) = ctx(EF) + comp(EF2) + db(2) + map(EF2)

ctx(L) = 0.09
ctx(EF) = 0.17
time(L1) = 0.87
time(EF1) = 0.55
time(L2) = 8.8
time(EF2) = 10.5

=>

0.87 = 0.09 + comp(L1) + db(1) + map(L1)
8.8 = 0.09 + comp(L2) + db(2) + map(L2)
0.55 = 0.17 + comp(EF1) + db(1) + map(EF1)
10.5 = 0.17 + comp(EF2) + db(2) + map(EF2)

=>

0.78 = comp(L1) + db(1) + map(L1)
8.71 = comp(L2) + db(2) + map(L2)
0.38 = comp(EF1) + db(1) + map(EF1)
10.33 = comp(EF2) + db(2) + map(EF2)

db(1) << db(2)
comp(L1) << comp(L2)
comp(EF1) << comp(EF2)
map(L1) ~= map(L2) = map(L)     // we can assume this because both queries return 10 rows
map(EF1) ~= map(EF2) = map(EF)  // we can assume this because both queries return 10 rows

=>

0.78 = comp(L1) + db(1) + map(L)        // (1)
8.71 = comp(L2) + db(2) + map(L)        // (2)
0.38 = comp(EF1) + db(1) + map(EF)      // (3)
10.33 = comp(EF2) + db(2) + map(EF)     // (4)

=> Let's subtract (3) from (1), and (4) from (2)

0.4 = comp(L1) - comp(EF1) + map(L) - map(EF)         // (5)
-1.62 = comp(L2) - comp(EF2) + map(L) - map(EF)       // (6)

=> Let's subtract (6) from (5); the map(L) and map(EF) terms cancel out

2.02 = comp(L1) - comp(EF1) - comp(L2) + comp(EF2)

comp(L1) << comp(L2)
comp(EF1) << comp(EF2)

=>

comp(L2) - comp(L1) + 2.02 = comp(EF2) - comp(EF1)

=> Since comp(L1) << comp(L2) and comp(EF1) << comp(EF2), comp(L1) and comp(EF1) are negligible

comp(L2) + 2.02 ~= comp(EF2)

So we can say that LINQ2SQL spends less time on query compilation than EF. And if so, then EF spends less time mapping the results, as I said above.

With context initialization:

Complex TOP 500 query

The complex TOP 500 query shows the same results as complex TOP 10: the database request time is comparable to the compilation time, therefore DbContext and ObjectContext take only twice as much time as ADO.NET and BLToolkit.

With context initialization:

Conclusions

  • Context initialization for DbContext CodeFirst is slightly faster than for DbContext Designer (both generated from the database).
  • Context initialization for ObjectContext is twice as slow as for DbContext. But the absolute time is not significant - 0.4 ms versus 0.2 ms.
  • LINQ2SQL can be faster than EF for complex queries, and some of its queries can also be precompiled.
  • EF has much faster mapping than LINQ2SQL.
  • ObjectContext is a bit slower than DbContext, but some queries can be precompiled (parameters cannot be sequences).
  • BLToolkit doesn't provide compile-time type checking, but it's nearly as fast as ADO.NET and has great mapping capabilities (this article is in Russian (the main site is currently down), but you can understand a bit from the code samples).
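The ObjectContext query precompilation mentioned above looks roughly like this. A sketch, assuming an EdmGen-generated `MyEntities` context with a `Products` set (the context name is mine):

```csharp
using System;
using System.Linq;
using System.Data.Objects; // CompiledQuery for ObjectContext lives here in .NET 4.x

// Sketch only - MyEntities and Product are assumed EdmGen-generated types.
class PrecompiledExample
{
    // The LINQ expression is translated once; subsequent calls reuse the
    // cached translation instead of compiling the query again.
    static readonly Func<MyEntities, int, IQueryable<Product>> ShortNamed =
        CompiledQuery.Compile((MyEntities ctx, int maxLen) =>
            ctx.Products.Where(r => r.Name.Length < maxLen));

    public static void Run()
    {
        using (var ctx = new MyEntities())
        {
            var products = ShortNamed(ctx, 10).ToList();
        }
    }
}
```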

Raw results (XLSX).

View project source code at Bitbucket.

Sunday, December 21, 2014

SQLite Entity Framework Database First Tutorial

This tutorial describes how to use the SQLite database-first approach with Visual Studio 2013 and Entity Framework 6. It is based on Tomasz Maciejewski's topic SQLite and Entity Framework with Visual Studio Express 2013, which shows how to generate an EDMX. That method works, but Visual Studio reports that the EDMX contains errors (though it still compiles). The method described in this article doesn't produce errors.

I will use Visual Studio 2013, SQLite 1.0.94.0, Entity Framework 6.1.1, and .NET Framework 4.5, targeting both 32 and 64 bit platforms.

Let's start from scratch and create a database for the simplest possible blog. There will be only posts and comments.

I used SQLiteStudio to create the database. These are the tables saved in the "Blog.sqlite" database:

CREATE TABLE Posts
(
    Id INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL,
    Created DATETIME NOT NULL,
    Text NTEXT NOT NULL
);

CREATE TABLE Comments
(
    Id INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL,
    Created DATETIME NOT NULL,
    PostId INTEGER NOT NULL REFERENCES Posts (Id),
    Text NTEXT NOT NULL
);

I'll create a console application for simplicity.

The next step is to create the Entity Framework schema files. They represent the conceptual and store schemas and the mapping between them. You'll need three things:

  1. Database file.
  2. EdmGen utility.
  3. SQLite.

They should be copied into the same directory.

The database is the "Blog.sqlite" we've just created.

EdmGen is a .NET utility located at "C:\Windows\Microsoft.NET\Framework64\v4.0.30319\EdmGen.exe". Copy it to the same directory and create an "EdmGen.exe.config" file with the following content:

<configuration>
    <system.data>
        <DbProviderFactories>
            <remove invariant="System.Data.SQLite" />
            <add name="SQLite Data Provider"
                invariant="System.Data.SQLite"
                description=".NET Framework Data Provider for SQLite"
                type="System.Data.SQLite.SQLiteFactory, System.Data.SQLite" />
        </DbProviderFactories>
    </system.data>
</configuration>

The last thing is SQLite. Download the precompiled binaries matching your system (32 or 64 bit), for example Precompiled Binaries for 64-bit Windows (.NET Framework 4.5). If you need both 32 and 64 bit support, do the following: download both the 32 and 64 bit versions, extract one of them from the ZIP, delete "SQLite.Interop.dll", then create "x86" and "x64" directories and put the 32 bit "SQLite.Interop.dll" into "x86" and the 64 bit version into "x64".

Now create "gen.bat" file with the following command:

EdmGen.exe /mode:fullgeneration /c:"Data Source=Blog.sqlite" /provider:System.Data.SQLite /entitycontainer:Blog /project:Blog /language:CSharp

Run "gen.bat" and the following files will be created: "Blog.csdl", "Blog.msl", "Blog.ssdl", "Blog.ObjectLayer.cs", "Blog.Views.cs":

EdmGen for Microsoft (R) .NET Framework version 4.5
Copyright (C) Microsoft Corporation. All rights reserved.

Loading database information...
Writing ssdl file...
Creating conceptual layer from storage layer...
Writing msl file...
Writing csdl file...
Writing object layer file...
Writing views file...

Generation Complete -- 0 errors, 0 warnings

Copy these 5 generated files into a "DAL" folder in the project directory. Create a "Data" folder in the project and copy "Blog.sqlite" into it. Add these files to the project, and add a reference to the "System.Data.Entity" assembly.

For testing, go to the properties of "Blog.sqlite" in Visual Studio and set the "Copy to Output Directory" parameter to "Copy always". This option copies the empty database from the project into the output directory, so you will always have an empty database on every application run from VS.

Then go to the properties of the schema files "Blog.csdl", "Blog.msl", and "Blog.ssdl" and set "Build Action" to "Embedded Resource" for each file. This puts them into the application resources so that Entity Framework can load them.

One last thing is left. Right-click the project in Solution Explorer and choose "Manage NuGet Packages...". Click "Online" in the left panel and type "sqlite" in the top-right search field. Install the "System.Data.SQLite (x86/x64)" package. This will change the "app.config" file - and it will change it incorrectly! You have to reorder some strings as follows (note the order of the "remove" and "add" tags):

<DbProviderFactories>
    <remove invariant="System.Data.SQLite" />
    <add name="SQLite Data Provider" invariant="System.Data.SQLite" description=".NET Framework Data Provider for SQLite" type="System.Data.SQLite.SQLiteFactory, System.Data.SQLite" />
    <remove invariant="System.Data.SQLite.EF6" />
    <add name="SQLite Data Provider (Entity Framework 6)" invariant="System.Data.SQLite.EF6" description=".NET Framework Data Provider for SQLite (Entity Framework 6)" type="System.Data.SQLite.EF6.SQLiteProviderFactory, System.Data.SQLite.EF6" />
</DbProviderFactories>

One more thing went wrong after adding the SQLite package: the "System.Data.Entity" reference disappeared. Add this reference again :), and also add a reference to "System.Runtime.Serialization". Now the project compiles without errors (but it still does nothing). After compilation you'll notice there is no "SQLite.Interop.dll" file in the output directory; instead there are two directories, "x86" and "x64", containing this file for 32 and 64 bit systems. That's because "System.Data.SQLite (x86/x64)" was installed for both 32 and 64 bit systems.

Remember, the CSDL, SSDL, and MSL schemas were added to the resources. It's time to use them. EF needs a connection string that specifies everything: the schemas, the database source, and the data provider. Schemas are added to resources under their full names, for example "EF_SQLite_Example.DAL.Blog.csdl" for the CSDL in this project. Add the following connection string to "app.config":

<connectionStrings>
    <add name="Blog"
        connectionString="metadata=res://*/EF_SQLite_Example.DAL.Blog.csdl|res://*/EF_SQLite_Example.DAL.Blog.ssdl|res://*/EF_SQLite_Example.DAL.Blog.msl;provider=System.Data.SQLite;provider connection string=&quot;data source=./Data/Blog.sqlite&quot;"
        providerName="System.Data.EntityClient" />
</connectionStrings>

Here "res://" means that the schema will be taken from resources, and "*" means the current application. It can also be written this way:

<connectionStrings>
    <add name="Blog"
        connectionString="metadata=res://*/;provider=System.Data.SQLite;provider connection string=&quot;data source=./Data/Blog.sqlite&quot;"
        providerName="System.Data.EntityClient" />
</connectionStrings>

This means the schemas will be searched for in all resource files. It's shorter, but a bit slower and less precise (you may have multiple schemas).

Add a reference to "System.Configuration" to load the connection string from the config.

Now SQLite can be used with EF6. Let's add some records to the DB and read them back.

using System;
using System.Configuration;
using System.Linq;
using Blog;

namespace EF_SQLite_Example
{
    class Program
    {
        static void Main(string[] args)
        {
            var connectionString = ConfigurationManager.ConnectionStrings["Blog"].ConnectionString;

            Console.WriteLine("Writing into database.");
            using (var ctx = new Blog.Blog(connectionString))
            {
                var post = new Posts
                {
                    Created = DateTime.UtcNow,
                    Text = "Example of post in this blog.",
                };

                var comment1 = new Comments
                {
                    Created = DateTime.UtcNow,
                    Text = "Comment 1 for the post.",
                };
                var comment2 = new Comments
                {
                    Created = DateTime.UtcNow,
                    Text = "Comment 2 for the post.",
                };
                
                post.Comments.Add(comment1);
                post.Comments.Add(comment2);

                ctx.Posts.AddObject(post);

                ctx.SaveChanges();

                Console.WriteLine("Post: Id = {0}", post.Id);
                Console.WriteLine("Comment 1: Id = {0}, PostId = {1}", comment1.Id, comment1.PostId);
                Console.WriteLine("Comment 2: Id = {0}, PostId = {1}", comment2.Id, comment2.PostId);
            }

            Console.WriteLine("\r\nReading from database.");
            using (var ctx = new Blog.Blog(connectionString))
            {
                var posts = ctx.Posts.Where(r => r.Comments.Any(r2 => r2.Text.Contains("2"))).ToList();

                foreach (var postItem in posts)
                {
                    Console.WriteLine("Post: Id = {0}", postItem.Id);
                }
            }

            Console.WriteLine("\r\nEND");
            Console.ReadKey();
        }
    }
}

The program produces this output:

Writing into database.
Post: Id = 1
Comment 1: Id = 1, PostId = 1
Comment 2: Id = 2, PostId = 1

Reading from database.
Post: Id = 1

Don't forget to regenerate the EF schemas and models after changing the database schema.

View project source code at Bitbucket.

Wednesday, March 12, 2014

MiCalc - High-precision Calculator with Big Numbers Support

MiCalc is a high-precision calculator that supports big numbers.

Features:

  • expression input as a string;
  • result output in normal, scientific, hexadecimal, and binary modes;
  • support for bitwise operators and trigonometric functions.

Requires .NET Framework 4.5.

Download MiCalc (115 KB)

View repository with sources