Channel: Dynamics 365 Business Central/NAV User Group
Viewing all 11285 articles

Forum Post: MICR String on check report in dynamics 365 business central saas version

I am working on Dynamics 365 Business Central. I have created a custom check report and downloaded the MICR Encoding font. It works well when the report is saved as a Word document, but not as a PDF: the font is not converted to MICR Encoding.

Forum Post: Merge Multiple Posted Sales Invoices into one document

Good morning, We are currently using NAV 2009 and are in the process of upgrading to Dynamics 365 Business Central. We have multiple customers who require a single invoice for all purchase orders. Items often ship separately, which creates multiple invoices for a single PO. Is there a way to merge multiple posted sales invoices into a single document? Thank you, TrishSafariMicro

Forum Post: RE: Merge Multiple Posted Sales Invoices into one document

And good evening to you :-) If the invoices are already posted, you can't combine them into one document. But it is possible to post a single invoice for several receipts.
1. Create a purchase invoice and fill in only the header information - vendor, posting date.
2. Run the function Get Receipt Lines. I don't remember what this action looks like in 2009, but in later versions it is Line | Functions | Get Receipt Lines. Run this action and select all receipts you want to invoice.
Now this invoice contains information from all selected receipts - when it is posted, all the related orders are invoiced.
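For the sales side of this scenario (one invoice covering several posted shipments of a customer), the same pattern can also be scripted. A minimal C/AL sketch, assuming the standard codeunit "Sales-Get Shipment" with its SetSalesHeader and CreateInvLines procedures (the sales-side counterpart of Get Receipt Lines) - verify the object and procedure names against your NAV version before relying on this:

```cal
// Hedged sketch: build one sales invoice from all uninvoiced shipments of a customer.
// CustomerNo is a Code[20] parameter; SalesGetShipment is Codeunit "Sales-Get Shipment".
SalesHeader.INIT;
SalesHeader."Document Type" := SalesHeader."Document Type"::Invoice;
SalesHeader.INSERT(TRUE);  // number series assigns the invoice No.
SalesHeader.VALIDATE("Sell-to Customer No.", CustomerNo);
SalesHeader.MODIFY(TRUE);

// Select every shipment line of this customer that still has quantity to invoice.
SalesShipmentLine.SETRANGE("Sell-to Customer No.", CustomerNo);
SalesShipmentLine.SETFILTER("Qty. Shipped Not Invoiced", '<>0');
IF SalesShipmentLine.FINDSET THEN BEGIN
  SalesGetShipment.SetSalesHeader(SalesHeader);
  SalesGetShipment.CreateInvLines(SalesShipmentLine);  // copies the shipment lines in
END;
```

Posting this invoice then marks all the related orders as invoiced, just as described above for the purchase side.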

Blog Post: Directions US 2019 – I appear to have work to do …

I’m returning from yet another great Directions US event, this time in Las Vegas. At this point – I’m sitting in the lounge in New York – it is a great opportunity to write a short post about something I didn’t really expect, and now feel quite obligated to do… In total, I did about 13 hours of sessions and workshops. That’s a lot. I was never busier – mainly because Vjeko couldn’t join this year, and I agreed to take over his sessions on Artificial Intelligence. That was a great experience – not only because it was very well prepared by Vjeko, but also because it was the perfect opportunity for me to dive into the matter. I loved it!

DevOps

But that’s not really what I want to put on the table. I also had a session (plus workshops) about DevOps together with Freddy. My part was “DevOps in practice”, where I showed how we implemented DevOps in our company, which is a VAR and ISV – so basically: how to solve the real-world development challenges in a typical Business Central partner company. The main part was that we developed a DevOps extension for managing all the customers and products in a standard way in DevOps. My intention was just to show it as an idea of how you can manage multiple repositories in DevOps in a very maintainable way – but after the session, so many people were interested in the extension. Well – my call to action is clear: let’s get it out there! I can’t tell you how I will do that, but I will definitely do my best to get it out there so you can easily set up builds for products and customers, including dependencies, versioning, testability, … in a matter of minutes!

Dependency Analysis

I was also involved in a session on “Migrating your solution to extensions” together with Gary. In that session, I again did a part on “how we do it”, where I paid a lot of attention to what we call a “dependency analysis”. Just because, in my opinion, if you think ahead, you have a much better chance of doing it right.
This analysis is a mostly automated way to visualize the dependencies of your code, based on the code you have already written in C/AL. You might already guess that the main tool we used for this is the one I addressed in this blogpost. We combined that with an extension and webgraphviz to visualize. I showed how we were able to basically visualize our complete codebase like this: And turned that into a dependent app collection, like this: This is a generated graph, produced by analyzing code and tagging objects – so we are quite sure it makes sense ;-). Again, lots of people were interested and wanted to do the same – so my call to action is to see what I can do to help you with this ;-). Stay tuned for more! If you will be at “Days Of Knowledge” in two weeks, I will be talking about both topics in my session “development methodologies for the future”. Thanks again for once again an outstanding edition of Directions US – always a blast to meet everyone!

Comment on Days of Knowledge – New BC Conference in Denmark

This conference is going to be really interesting. The speakers who will be at the conference are really talented, and at australia writings you can read more about them. Thanks for sharing this information with us here. The conference will be held in English, which is great, as the majority of people around the world understand it, and that will boost communication.

Blog Post: Version Manager shows off at Directions NA in Las Vegas

Thank you for four hectic days in Las Vegas, Nevada. We took Version Manager out for a spin at Directions NA, and the interest was fantastic. Using the functionality of Version Manager to help companies with SOX compliance has proven to be a great idea. In 2002, the United States Congress passed the Sarbanes-Oxley Act (SOX) to protect shareholders and the general public from accounting errors and fraudulent practices in enterprises, and to improve the accuracy of corporate disclosures. The act sets deadlines for compliance and publishes rules on requirements. Congressmen Paul Sarbanes and Michael Oxley drafted the act with the goal of improving corporate governance and accountability, in light of the financial scandals at Enron, WorldCom, and Tyco, among others. All public companies must now comply with SOX, both on the financial side and on the IT side. The way IT departments store corporate electronic records changed as a result of SOX. While the act does not specify how a business should store records or establish a set of business practices, it does define which records should be stored and for how long. To comply with SOX, corporations must save all business records, including electronic records and electronic messages, for "not less than five years." Consequences for noncompliance include fines, imprisonment, or both. Version Manager can help you create an audit trail of all changes made to the source code of any on-premises version of Dynamics NAV, from version 4.0 to Dynamics 365 Business Central. The only requirement for versions 4.0 through 2009 is that they run on SQL Server. In later versions, Version Manager will also keep an audit trail of installed and uninstalled extensions. For more information on the primary functionality of Version Manager, check our home page at http://versionmanager.dk .

Forum Post: RE: Set Job Task No. with extension

Thanks Alexander, I'll work on it and let you know if it works. Really appreciated!

Forum Post: Adding nodes with JsonTextWritter

Hi everyone, This is probably an easy question, but I can't find the way to do it. This is the structure of an XMLPort: I need to generate a JSON file with that structure, and I'm trying it this way:

LOCAL CreateSimpleJsonFile(VAR JSonResponse : DotNet "Newtonsoft.Json.Linq.JObject";peShopItem : Record "eShop Item")
JSonResponse := JSonResponse.JObject();
JsonTextWriter := JSonResponse.CreateWriter();
JsonTextWriter.WritePropertyName('Ref');
JsonTextWriter.WriteValue(peShopItem."No.");
JsonTextWriter.WritePropertyName('DeleForEshop');
JsonTextWriter.WriteValue(peShopItem."Delete-for Eshop");
JsonTextWriter.WritePropertyName('Picture1');
JsonTextWriter.WriteValue(peShopItem."Picture 1");
JsonTextWriter.WritePropertyName('Category');
JsonTextWriter.WriteValue(peShopItem."Item Category Code");
JsonTextWriter.WritePropertyName('Subcategory');
JsonTextWriter.WriteValue(peShopItem."Product Group Code");
JsonTextWriter.WritePropertyName('SubcDescription');
JsonTextWriter.WriteValue(peShopItem."Item Group Description");
JsonTextWriter.WritePropertyName('Range');
JsonTextWriter.WriteValue(peShopItem.Range);
JsonTextWriter.WritePropertyName('Reading');
JsonTextWriter.WriteValue(peShopItem.Reading);
JsonTextWriter.WritePropertyName('WeightKg');
JsonTextWriter.WriteValue(peShopItem."Weight (Kg)");

leShopItembyShop.RESET;
leShopItembyShop.SETRANGE("Item No.",peShopItem."No.");
IF leShopItembyShop.FINDSET THEN
  REPEAT
    JsonTextWriter.WritePropertyName('eShopURL');
    JsonTextWriter.WriteValue(leShopItembyShop."eShop URL");
    JsonTextWriter.WritePropertyName('Enabled');
    JsonTextWriter.WriteValue(leShopItembyShop.Enabled);
    JsonTextWriter.WritePropertyName('Description');
    JsonTextWriter.WriteValue(leShopItembyShop.Description);
    JsonTextWriter.WritePropertyName('Availability');
    JsonTextWriter.WriteValue(leShopItembyShop.Availability);
    JsonTextWriter.WritePropertyName('GenericPrice');
    JsonTextWriter.WriteValue(leShopItembyShop."Generic Price");
    JsonTextWriter.WritePropertyName('InTransit');
    JsonTextWriter.WriteValue(leShopItembyShop."Qty. on Purch. Order");
    JsonTextWriter.WritePropertyName('FromTransitDate');
    JsonTextWriter.WriteValue(leShopItembyShop."Next Reception Date");
    JsonTextWriter.WritePropertyName('InCentral');
    JsonTextWriter.WriteValue(leShopItembyShop."Qty. on Origin");
    JsonTextWriter.WritePropertyName('FromCentralDate');
    JsonTextWriter.WriteValue(leShopItembyShop."Recep. Date-from Origin");
    JsonTextWriter.WritePropertyName('FromCentralDateText');
    JsonTextWriter.WriteValue(leShopItembyShop."Recep.-from Origin (Text)");
    JsonTextWriter.WritePropertyName('URLTechnicalCard');
    JsonTextWriter.WriteValue(leShopItembyShop."Technical Card url");
  UNTIL leShopItembyShop.NEXT = 0;

As you can imagine, after the FINDSET I should create another, deeper level of the JSON, but this is not working. What statement should I use? Thank you very much

Forum Post: RE: Adding nodes with JsonTextWritter

Use Newtonsoft.Json.JsonConvert.'Newtonsoft.Json, Version=6.0.0.0, Culture=neutral, PublicKeyToken=30ad4fe6b2a6aeed' and Newtonsoft.Json.Formatting.'Newtonsoft.Json, Version=6.0.0.0, Culture=neutral, PublicKeyToken=30ad4fe6b2a6aeed'. Run XMLPORT.EXPORT to create the XML, load it into an XmlDocument, then convert the XML to JSON using:

JSonText := JSONConvert.SerializeXmlNode(XmlDocument.DocumentElement,JSONFormatting.Indented,TRUE);

Forum Post: RE: Adding nodes with JsonTextWritter

Thanks for your answer. This was my first approach:

ExporteShopItemJSON(VAR myXMLPort : Integer) myJSON : Text
blob.INIT;
blob.Blob.CREATEOUTSTREAM(myOutStream);
XMLPORT.EXPORT(myXMLPort, myOutStream);
ConvertXMLToJSON(blob);
CLEAR(gJSONBig);
myJSON := blob.ReadAsText('',TEXTENCODING::UTF8);
MyFile.CREATE('C:\Users\Public\Documents\eshop\JSON\JsonTest'+FORMAT(TODAY,0,' ')+'_'+FORMAT(TIME,0,' . . ')+'_DESA.json');
MyFile.CREATEOUTSTREAM(myOutStream);
myOutStream.WRITETEXT(myJSON);

LOCAL ConvertXMLToJSON(VAR TempBlob : Record TempBlob)
TempBlob.Blob.CREATEINSTREAM(myInStream);
XmlDocument := XmlDocument.XmlDocument;
XmlDocument.Load(myInStream);
myJSON := JSONConvert.SerializeXmlNode(XmlDocument.DocumentElement,JSONFormatting.Indented,TRUE);
TempBlob.INIT;
TempBlob.Blob.CREATEOUTSTREAM(myOutStream,TEXTENCODING::UTF8);
myOutStream.WRITETEXT(myJSON);

But when the XML file is quite big, the process fails, and the next processes do too. This procedure is called from the Windows Server task scheduler. Afterwards I must send the JSON file to a REST API, and the whole process is repeated every hour, so after one failure, all the following processes break. I realized that the problem is that if the XML is too big, the conversion fails; that's why I'm trying to generate the JSON file directly.

Forum Post: Archive Sales Blanket Orders?

Seems like there is no functionality to archive sales and purchase blanket orders - correct?

Forum Post: RE: Adding nodes with JsonTextWritter

JsonTextWriter is not aware that you want to write a hierarchical structure, and of course, it will continue writing a linear list. To create a hierarchy, call WriteStartObject before the loop, and WriteEndObject after it.

JsonTextWriter.WriteValue(peShopItem."Weight (Kg)");
JsonTextWriter.WritePropertyName('ByShop');
JsonTextWriter.WriteStartObject;
leShopItembyShop.RESET;
leShopItembyShop.SETRANGE("Item No.",peShopItem."No.");
IF leShopItembyShop.FINDSET THEN
  REPEAT
  UNTIL...
JsonTextWriter.WriteEndObject;

Blog Post: Microsoft Dynamics 365 Business Central Spring update changed “some” field lengths

For a long time, we (as a partner community) have been asking for longer field lengths. And this time, Microsoft seems to have delivered: about 860 field lengths were changed, mostly from Text50 to Text100, but also from Text30 to Text50 and so on. For your convenience, I created a csv on my “CALAnalysis” repo on GitHub that lists all of them.

Caveats

With this change, there are some caveats that I can think of – and there might be even more than I list here. First of all: what if a Business Central database contains C/AL customizations, AL extensions, or apps on AppSource? If a database is being upgraded to the Spring Release, and in your code you assign a field that is now Text100 to some other field in your solution that is probably still just Text50, there is a danger of an overflow, which (I think) is a runtime error. In the case of extensions, in theory, code analysis should catch this: you should get “overflow” warnings when you compile your code against symbols from BC version 14. For an app on AppSource, ISVs should already have caught it this way in their build against the “next version” (insider) – if they had set that up, of course. For “Per Tenant Extensions”, you typically don’t set that up, so I’d strongly advise starting to check all these extensions, as unexpected errors might happen once people start using these lengths… I can only imagine what kind of situations that would cause – it could turn into a support nightmare… For customizations (I mean ANY solutions that are still in C/AL), you are quite “bleeped” (pardon my French), because we don’t have code analysis there – no compiler is going to help us. So if you upgrade your C/AL to the Spring Release, I highly advise you to take this into account. I will suggest a few ways to handle this further on in this blogpost… Another caveat I can think of is reports. All of a sudden, an item description can be 100 characters.
I don’t know any report layout that fits a 100-character description – or I am missing something. And do realize, we are talking about 860 fields here, not “just” the Item Description. So potentially any kind of text field on a report can end up not being fully displayed… I don’t know about you, but I know my customers will not appreciate this.

What can you do?

Code Analysis

The most obvious thing to do is compile your code against the latest release. For any kind of extension, and especially the ones on Business Central SaaS (since those are upgraded automatically), you should set up a scheduled build against the “next release”. You can easily do that with DevOps in combination with Docker. In these cases, it’s highly valuable, as you would catch many of these possible overflows that way. Though I don’t know about TRANSFERFIELDS and maybe other statements that assign values to variables - please pay enough attention to this. The way to solve this is to also change the lengths of your own fields - not by cropping values and losing data in the tables where they would end up, obviously.

Roll Back Microsoft’s changes

This probably sounds like the most ridiculous option you can choose. I mean: change 860 field lengths back to their original lengths? Are you kidding me? Well, first of all, in my opinion, it’s a valid option for OnPrem C/AL solutions (obviously not for any kind of extension/app solution). It’s actually what we did – at least as a temporary way to go. The reason is twofold: first of all, this is the last C/AL release of our product; the next release will be full-extension. If we do any migration to our new solution, then there is no problem. Second: this was the only way we could 100% identify all the changes we needed to make to keep the whole solution stable (frankly, we weren’t waiting for bigger lengths).
Because after quite some attempts, there was no way for us to identify all the places where we assign these changed fields, and where the values would eventually end up in custom fields that might cause a problem. Maybe a third reason on top of the two I was going to mention: we could automate this. This is the simple script we used to roll back these changes (my colleague created this one, which was twice as fast as my version ;-)): https://github.com/waldo1001/Waldo.Model.Tools/blob/master/ChangeObjects/RestoreFieldLengths.ps1

Change all Text-datatypes to Text100

If we can do the above, we would also be able to identify all Text fields with a length lower than 100 – and we would be able to change them to 100, right? It’s actually fairly easy to do, but in my opinion, not 100% safe (as some Text fields were changed to 260 or even 2048). And obviously, it wouldn’t solve the report problem… So we didn’t go for this option, and I can’t give you the script, but at least you have the building blocks to do it ;-).

How did I analyze this?

Well, as you might have read, I used the code analysis tool that we created internally (and blogged about in this post). The script to compare the field lengths and list the differences in datatype can be found on my GitHub here: https://github.com/waldo1001/Waldo.Model.Tools/blob/master/Analyze/CompareFieldLengths.ps1 . So – we survived the snap – hope you will as well ;-).
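For C/AL code that assigns one of the lengthened base fields to a shorter custom field, the classic defensive pattern is COPYSTR with MAXSTRLEN. A minimal sketch - "Custom Description" and MyRecord are hypothetical names, not objects from this post:

```cal
// Hypothetical example: Item.Description is Text100 after the Spring Release,
// while "Custom Description" is still a Text50 field in a custom table.
// A plain assignment raises a runtime overflow error when Description is
// longer than 50 characters; COPYSTR truncates it to the target length instead.
MyRecord."Custom Description" :=
  COPYSTR(Item.Description, 1, MAXSTRLEN(MyRecord."Custom Description"));
```

Note that this avoids the runtime error but silently drops characters - exactly the data loss warned about above - so widening your own fields remains the cleaner fix.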

Forum Post: RE: Adding nodes with JsonTextWritter

Yes, thank you!!! I knew it should be something that easy, really appreciated!!!

Edit: I will always have two "ByShop" nodes, but with this code, I always write only the second one, even though in the debugger I can see that it finds both of them:

{
  "Ref": "1108-150",
  "DeleForEshop": false,
  "Picture1": "http://www.xxxx.com/epicture/xxxx/1108-150W.jpg",
  "Category": "xxxxx",
  "Subcategory": "02",
  "SubcDescription": "Calibres",
  "Range": "0-150mm/0-6\"",
  "Reading": "0.01mm/0.0005\"",
  "WeightKg": 0.3875,
  "ByShop": {
    "eShopURL": "eShop2",
    "Enabled": true,
    "Description": "Calibre Digital 0-150mm/0-6\"",
    "Availability": 335.0,
    "GenericPrice": 0.0,
    "InTransit": 0.0,
    "FromTransitDate": "0001-01-01T00:00:00",
    "InCentral": 0.0,
    "FromCentralDate": "0001-01-01T00:00:00",
    "FromCentralDateText": "",
    "URLTechnicalCard": "http://www.xxxxx.com/pdf/xxxxx/esp/1108.pdf"
  }
}

Maybe the JsonTextWriter is overwriting the first one?

Forum Post: RE: Adding nodes with JsonTextWritter

I've tried this code:

JSonResponse := JSonResponse.JObject();
JsonTextWriter := JSonResponse.CreateWriter();
JsonTextWriter.Formatting := 1;
JsonTextWriter.WritePropertyName('Ref');
JsonTextWriter.WriteValue(peShopItem."No.");
JsonTextWriter.WritePropertyName('ByShop');
leShopItembyShop.RESET;
leShopItembyShop.SETRANGE("Item No.",peShopItem."No.");
IF leShopItembyShop.FINDSET THEN
  REPEAT
    JsonTextWriter.WriteStartObject;
    JsonTextWriter.WritePropertyName('eShopURL');
    JsonTextWriter.WriteValue(leShopItembyShop."eShop URL");
    JsonTextWriter.WriteEndObject;
  UNTIL leShopItembyShop.NEXT = 0;

adding WriteStartObject and WriteEndObject inside the REPEAT, but I get this error: "Error in the cal ... with the message...."

Forum Post: RE: Adding nodes with JsonTextWritter

You still need to create a container for your objects before entering the loop - this code is trying to assign multiple values to a single property 'ByShop', but a property value must be atomic. If you want to create several objects inside the loop, you need an array. Leave WriteStartObject and WriteEndObject inside the loop, but add WriteStartArray and WriteEndArray outside of it.
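Putting this advice together with the earlier attempt, the loop would look as follows - a sketch assembled from the code already posted in this thread (same variables and field names), not tested against a specific NAV build:

```cal
// 'ByShop' becomes an array property; each record becomes one object inside it.
JsonTextWriter.WritePropertyName('ByShop');
JsonTextWriter.WriteStartArray;  // container for all per-shop objects

leShopItembyShop.RESET;
leShopItembyShop.SETRANGE("Item No.",peShopItem."No.");
IF leShopItembyShop.FINDSET THEN
  REPEAT
    JsonTextWriter.WriteStartObject;  // one object per record
    JsonTextWriter.WritePropertyName('eShopURL');
    JsonTextWriter.WriteValue(leShopItembyShop."eShop URL");
    // ...remaining properties of this record...
    JsonTextWriter.WriteEndObject;
  UNTIL leShopItembyShop.NEXT = 0;

JsonTextWriter.WriteEndArray;  // close the 'ByShop' array
```

With this shape, both records end up in the output as separate elements of the 'ByShop' array instead of one overwriting the other.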

Forum Post: RE: Adding nodes with JsonTextWritter

Yes, you did it again! Really appreciated Alexander, Thank you very much!!!!

Forum Post: RE: reopened delivered(closed) sales order

Thanks a lot. It was a helpful answer, but not exactly what I wanted to know; however, the problem is solved. The problem was shipping the remaining items to the customer after we had closed the sales order earlier - not storing the remaining items in the warehouse and bins after shipping.

Forum Post: RE: reopened delivered(closed) sales order

[quote userid="205109" url="~/nav/f/developers/96736/reopened-delivered-closed-sales-order/502914"]the problem was sending remaining item for customer while we closed sales order before[/quote] Ok, that is very good. But to be clear about the flow: you need to create a new Warehouse Shipment document => create and register the Warehouse/Inventory Pick => post the Warehouse Shipment.

Comment on Navision 3.70.B US/CAN database

Hello Erik, do you have a link to download the FR version of NAV 3.70? Thanks!