
A compare tool dedicated to NAV object text files and the pitfalls of comparing them

Since I wrote about it yesterday on Luc van Vugt's blog, I think it's now time to tell you about a project I've been working on for quite some time.

The project is a tool that lets you bypass some of the pitfalls of comparing NAV object text files in a regular text-compare tool like WinMerge, Beyond Compare and so on.

The basic concept of the tool is to take the files, parse them, generate objects out of the different parts, and treat each part in its context. This enables context-sensitive comparing of these objects. A typical example would be two versions of an object, for example table 27 (Item), one from a W1 database and one from a localized database. Now you want to merge some code from one of the versions into the other, but you are forced to go through all the captions shown as differences, which slows down your merging process.

There are many such cases where a traditional text compare won't give you a good view of what is actually different between the objects. This is where contextual comparing comes into play.

With the tool I'm working on you will be able to do things like:
– Ignore missing captions
– Map different caption languages (e.g. set a rule that ENG = ENU)
– Identify parts that are contextually the same (like two variables with the same name) but are in different places inside the text file; those will be treated as equal, not different
– Set different strategies to find the corresponding part in the other object (find by ID, by name, by SourceExpr, etc.)
– Various ignore filters
– Safe merging (no ID conflicts; new IDs will be created)
– Modified files will always be consistent; no more missing ‘{‘ brackets because of manual file editing
– and many more
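
The context-sensitive matching idea above can be sketched roughly as follows. This is a hypothetical illustration only (the real tool's data model and names are not published); it shows how matching variables by name, rather than by position in the file, makes reordered declarations compare as equal:

```python
# Hypothetical sketch of context-sensitive matching: variables are paired
# by name instead of by line position, so reordered declarations do not
# show up as differences. All names and structures here are illustrative.
def match_variables(left, right):
    """Pair up variables from two objects by name and classify them."""
    left_by_name = {v["name"]: v for v in left}
    right_by_name = {v["name"]: v for v in right}

    equal, different, missing = [], [], []
    for name, lvar in left_by_name.items():
        rvar = right_by_name.get(name)
        if rvar is None:
            missing.append(name)            # exists only on the left side
        elif lvar["type"] == rvar["type"]:
            equal.append(name)              # same name, same type: equal
        else:
            different.append(name)          # same name, different definition
    return equal, different, missing

# Two objects declaring the same variables in a different order:
a = [{"name": "Item", "type": "Record 27"}, {"name": "Qty", "type": "Decimal"}]
b = [{"name": "Qty", "type": "Decimal"}, {"name": "Item", "type": "Record 27"}]
print(match_variables(a, b))  # → (['Item', 'Qty'], [], [])
```

A plain line-based diff would flag both declarations as changed here; the name-based pairing reports no differences at all.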

There are also plans to include handy features like some sort of object analyzer (where used, etc.), easy object renumbering, batch code insertion, and so on.

But the main advantage is that the different parts of the objects are treated and compared as such, and not just as a flat text file. You should be able to decrease your comparing effort a lot when dealing with structure changes, captions and lots of other things.

A special but maybe very handy feature is the form designer, as you can see in picture 2. It will enable you to effectively merge forms with a direct preview of what you will get, so you are no longer forced to check your fields in the NAV designer to ensure they will be shown correctly. The designer enables you to drag and drop controls, resize them, etc.

But because I think a picture says more than a thousand words, have a look at this example:

If you have any questions about the tool, or you have a nice feature in mind that should be implemented, please feel free to contact me or post a comment!


Standard Security Model & Custom Fields on the User Table

Another day, another problem 😉

Last Friday we had an issue when moving a SQL database from an old server to a newer one: suddenly some people were not able to print some reports.
Long story short:

The issue was related to custom fields on table 2000000002 User. These fields were updated when the print button was pressed, which is very bad when using the standard security model. With the standard model, all permissions get synchronized whenever the User table is modified. Since we did this before printing, in the context of a normal user who doesn't have the db_securityadmin right in SQL, the synchronization fails with this error message:

"The Microsoft Dynamics NAV Classic and SQL Server security systems have not been successfully synchronized.  The SQL Server login <User> does not exist on the <Server> server."

I don't really know why it worked on the old server, and to be honest, I don't want to know… because if every user is able to synchronize permissions, something is seriously wrong, imo (btw: these security settings weren't done by us).
So to solve the issue, we moved the fields from the User table to the User Setup table, and everything is working fine.

So if you ever think you need custom fields on the User table, consider putting them on the User Setup table instead, because they belong there anyway.

Error when adding a SQL user that was deleted before

Recently we encountered a SQL problem: you get a SQL error message when adding a user in Navision that once existed in the database and was later deleted.

The error message says that the user already exists. But the error message is wrong 😉 because you removed the user, so it doesn't exist, right?
The problem is that when you remove the user in Navision it is removed from the NAV database, but the user exists in the master database too. That entry is created when you enter the user name in the User table in NAV. Since the entry is not removed when you delete the user in NAV, you have to delete the user in the master database manually in order to be able to add the user in NAV again!

So if you are not able to create a user in NAV because it already exists, go to the master database and search for an existing entry for that user.

Read Content of NAV Notes/BLOBs from SQL

Today somebody asked about extracting the values of the Note field in the Record Link table from SQL (http://www.mibuso.com/forum/viewtopic.php?f=32&t=51435&p=249029#p249029).

As we had the same question come up earlier this week (funny, huh?), I did some testing, and here's a possible solution:

First you need a .NET assembly with a static method that uses a BinaryReader to read the data stored in the BLOB field. Here is the sample:

using System;
using System.Data;
using System.Data.SqlClient;
using System.Data.SqlTypes;
using System.IO;
using Microsoft.SqlServer.Server;

public class NAVNoteHandler
{
    [SqlFunction(DataAccess = DataAccessKind.Read)]
    public static string GetNoteAsString(string NoteID)
    {
        try
        {
            using (SqlConnection connection = new SqlConnection("context connection=true"))
            {
                connection.Open();
                // Use a parameter instead of string.Format to avoid SQL injection
                SqlCommand command = new SqlCommand(
                    "SELECT [Note] FROM [Record Link] WHERE [Link ID] = @NoteID", connection);
                command.Parameters.AddWithValue("@NoteID", NoteID);

                using (SqlDataReader reader = command.ExecuteReader())
                {
                    if (reader.Read() && !reader.IsDBNull(0))
                    {
                        SqlBytes bytes = reader.GetSqlBytes(0);
                        using (MemoryStream ms = new MemoryStream(bytes.Value))
                        using (BinaryReader binReader = new BinaryReader(ms))
                        {
                            // The note is stored as a length-prefixed string,
                            // which BinaryReader.ReadString understands
                            return binReader.ReadString();
                        }
                    }
                }
            }
            return string.Empty;
        }
        catch (Exception)
        {
            return string.Empty;
        }
    }
}

Note: The assembly must be compiled for the same platform the SQL Server instance is running on. So make sure you create a 32-bit assembly for a 32-bit SQL Server, but a 64-bit one if your SQL Server is 64-bit.
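
For illustration, the format that BinaryReader.ReadString expects is a 7-bit-encoded length prefix (little-endian, 7 bits per byte, high bit set on continuation bytes) followed by that many UTF-8 bytes. A minimal Python sketch of the same decoding, to show what the C# call is doing under the hood:

```python
import io

def read_dotnet_string(stream):
    """Decode a string the way .NET's BinaryReader.ReadString does:
    a 7-bit-encoded length prefix followed by that many UTF-8 bytes."""
    length, shift = 0, 0
    while True:
        b = stream.read(1)[0]
        length |= (b & 0x7F) << shift   # lower 7 bits carry the value
        if not b & 0x80:                # high bit clear: last prefix byte
            break
        shift += 7
    return stream.read(length).decode("utf-8")

# A buffer as .NET's BinaryWriter.Write(string) would produce it:
buf = io.BytesIO(bytes([5]) + b"Hello")
print(read_dotnet_string(buf))  # → Hello
```

Strings longer than 127 bytes get a multi-byte prefix, which is why you cannot simply skip one byte and read the rest.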

This assembly can now be registered with SQL Server so you can use it like a built-in function. You can read more about that here: http://www.codeproject.com/KB/database/ … n_SQL.aspx

The SQL statements required to register the assembly are the following:

exec sp_configure 'clr enabled',1
reconfigure
go

CREATE ASSEMBLY NAVNoteHandler
AUTHORIZATION dbo
FROM 'C:\Assemblies\NAVNoteHandler.dll'
GO

use [Demo Database NAV (6-0)]
go
CREATE FUNCTION f_NAVNoteHandler_GetNoteAsString
(
 @id as nvarchar(200)
)
RETURNS nvarchar(200)
AS EXTERNAL NAME NAVNoteHandler.[SQLProcedures.NAVNoteHandler].GetNoteAsString
GO

You can now read the Note as a string with this call:

Select [Demo Database NAV (6-0)].[dbo].[f_NAVNoteHandler_GetNoteAsString]('1')  as [Text]

As you may have noticed if you tried to call the function just now, it returns garbage, because BLOB fields in NAV are compressed by default. To retrieve a readable version of the BLOB's content you need to set the “Compressed” property of the field in NAV to No!

Display Text from a BLOB Field in an RDLC Report

With reference to the post I wrote yesterday and a question that came up on mibuso, I found another great way to use the .NET StreamReader class for a simple NAV problem.

If you want to display text that is stored in binary format in a report, you would need to loop and read the content of the BLOB piece by piece.

With .NET Interop we can do this a lot better. The StreamReader class is ideal for this: it reads the text inside the BLOB, detects the correct charset, and returns the whole content with one method call.
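
Passing TRUE as the second constructor argument tells StreamReader to detect the encoding from a byte order mark at the start of the stream. Roughly what that detection does, sketched in Python (simplified to the common BOMs; StreamReader also handles a few more cases):

```python
import codecs

def detect_bom(data):
    """Simplified sketch of StreamReader's BOM-based encoding detection."""
    if data.startswith(codecs.BOM_UTF8):        # EF BB BF
        return "utf-8-sig"
    if data.startswith(codecs.BOM_UTF16_LE):    # FF FE
        return "utf-16-le"
    if data.startswith(codecs.BOM_UTF16_BE):    # FE FF
        return "utf-16-be"
    return "utf-8"  # no BOM: fall back to a default encoding

sample = codecs.BOM_UTF8 + "äöü".encode("utf-8")
print(detect_bom(sample))  # → utf-8-sig
```

This is why the one-line ReadToEnd call gets umlauts and other special characters right without any manual charset handling.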

Simply define a variable of type DotNet with subtype:

'mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.IO.StreamReader

and an InStream variable.

Use this code to create an instance of the StreamReader:

IF ISSERVICETIER THEN BEGIN
  TempBlob.Blob.CREATEINSTREAM(IStream);
  StreamReader := StreamReader.StreamReader(IStream,TRUE);
END;

As the source expression of the report's textbox (in the classic section designer) you set the following:

StreamReader.ReadToEnd()

If the report still needs to be executable in the Classic client, set the textbox's Visible property to FALSE.

In the RDLC layout you can assign the dataset field to any textbox (set CanGrow = True to let it display the whole content).

Clean workaround for wrong special characters in BigText variables

In a previous blog post about web services I wrote about saving the result of an XMLport to a BigText variable, and the problems with charsets and special characters like ä, ö, ü, etc.

Originally I converted the whole output from ANSI to ASCII, which worked but was not very performant…

Now with .NET Interop we have a very clean and nice way of doing this. Here is the code:

TempBLOB.INIT;
TempBLOB.Blob.CREATEOUTSTREAM(XMLOutStream);

XMLServicePlanChart.SETDESTINATION(XMLOutStream);
XMLServicePlanChart.EXPORT();

TempBLOB.INSERT;
TempBLOB.Blob.CREATEINSTREAM(XMLInStream);

StreamReader := StreamReader.StreamReader(XMLInStream, TRUE);
_BigTextVar.ADDTEXT(StreamReader.ReadToEnd());

StreamReader is a DotNet variable:

Name: StreamReader
DataType: DotNet
Subtype: ‘mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089’.System.IO.StreamReader

As a result you have the BigText variable with the correct content, in the correct encoding, as you would expect.

How to save space in RDLC Documents (dynamic header-height workaround)

As you may have experienced with RDLC document reports like Sales Order, Invoice, etc., you face the problem of losing a lot of space to header information that mostly only needs to be printed on the first page, but not on the following ones.

This is due to the lack of a dynamic header height, unlike classic report design, where you can have more than one header and show them only on the first page / on all other pages.

In RDLC we only have one header section, with a fixed height. This puts us in situations where we have report layouts like this:


This is the W1 standard Purchase Quote report; the header information is printed on every page, but most likely you want all of it printed only on the first page, while all other pages show just the report title, date, page number, etc.

This can be achieved with a simple trick. Make the header as small as you want it to be on repeating pages (> page 1). To do this, move all the necessary textboxes to the list in the body section.

Second, you need to manage the visibility of the textboxes you want displayed only on page one. Best is to use this kind of expression:

=Code.GetGroupPageNumber(ReportItems!NewPage.Value, Globals!PageNumber) > 1

After that, the layout should look something like this (this is another report and contains customer changes, but you'll get the point):


This is the basic concept; with this you have different headers for different pages.

This can cause problems when used with in-document page breaks (like in the DACH version), where you manually break pages based on, for example, the sales line type; it can lead to the header being shown after such a break. To resolve this, simply put all body textboxes in a rectangle, then put that rectangle in a new table header. In this header, set the hidden expression to something like this:

=(Cstr(First(Fields!Sales_Line___No__Caption.Value)) > "")

You can take any value from the lines; they must all be empty when page 1 is printed.

The result at the end looks like this: