Forum Post: Convert Sales Order to Sales Invoice (Without Posting the Sales Invoice) - Business Central Cloud
Dears, good day to all. As you are aware, converting a sales quote to a sales order is simple, but I did not find any option to convert a sales order to a sales invoice (without posting the sales invoice). I've seen one developer do this in NAV 2016 by adding a new button (action) to the Sales Orders list page and the Sales Order card. It simply copied the sales order, created the sales invoice, and then deleted the sales order. Thanks in advance for your help.
↧
Forum Post: How to Repeat execute codeunit with changecompany?
Hi all, I saw the steps to upgrade NAV 2009 R2 to NAV 2013, and there is a step "Task 4: Step 1 Data Conversion". Since I have more than 50 companies, is there a way I can write code to repeat this? --> https://docs.microsoft.com/en-us/previous-versions/dynamicsnav-2013r2/hh166395(v=nav.71) The same goes for "Task 11: Step 2 Data Conversion" --> https://docs.microsoft.com/en-us/previous-versions/dynamicsnav-2013r2/hh169256(v=nav.71) I tried using CHANGECOMPANY, but it only works for tables, not codeunits. Is there a way to do so? Thanks in advance.
↧
Blog Post: Poisson d'Avril - The Sequel, part #9
Sitting in the train up north on my biweekly trip to my main employer, I came across Tobi's early tweet: a big joy that this great technical mind and very nice fellow human will be MVP for another year. He deserves it so much, like various other MVPs. And ah, yes, July 1, which for two years now has been the regular renewal date of the MVP Award. No longer April 1, as it used to be for a quarter of the MVPs, me belonging to that same batch. Automatically the question came to mind whether I too would again be part of this more than interesting group of people. No direct message from MS so far, but the MVP site clearly congratulates me on my MVP Award. Yesssss. For the 9th time. I always wonder whether my community work of the past year is worth another award, as there are no strict rules about what you should have done. No dashboard displaying gauges that indicate my level or progress during the year. In the end it's a couple of people at MS who, based on your achievements, pass the judgement. Apparently, for the 9th time, they have found my activities worth an award. Thanx for that. It's still a joy to be part of this. Congrats to all my peers who have been "renewed". And regarding the title of this post, "Poisson d'Avril - The Sequel, part #9": I just like it, and it relates to the previous posts on the same subject.
↧
Forum Post: RE: Convert Sales Order to Sales Invoice (Without Posting the Sales Invoice) - Business Central Cloud
You're right, there is no action to create an invoice from an order, but I guess you can use the "Copy Document" function in this case: create an invoice and copy the order data into the new invoice. If you need an action button on the order card, then it's a bit of coding, as you described in the question. You'll need to automate the copying in code, as sketched below.
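A minimal AL sketch of such an action, assuming codeunit "Copy Document Mgt." (codeunit 6620) is used for the copying; the exact SetProperties/CopySalesDoc signatures vary between BC versions, so treat this as a starting point rather than the definitive implementation:

pageextension 50100 "Sales Order Create Invoice" extends "Sales Order"
{
    actions
    {
        addlast(processing)
        {
            action(CreateUnpostedInvoice)
            {
                Caption = 'Create Invoice (Unposted)';
                ApplicationArea = All;

                trigger OnAction()
                var
                    SalesHeader: Record "Sales Header";
                    CopyDocMgt: Codeunit "Copy Document Mgt.";
                begin
                    // Create an empty invoice header; the number series assigns the number on insert.
                    SalesHeader.Init();
                    SalesHeader."Document Type" := SalesHeader."Document Type"::Invoice;
                    SalesHeader.Insert(true);

                    // IncludeHeader = true, HideDialog = true; remaining options off.
                    CopyDocMgt.SetProperties(true, false, false, false, true, false, false);
                    CopyDocMgt.CopySalesDoc("Sales Document Type From"::Order, Rec."No.", SalesHeader);

                    Page.Run(Page::"Sales Invoice", SalesHeader);
                end;
            }
        }
    }
}

Deleting the original order afterwards, as the NAV 2016 solution did, would be a separate (and riskier) step.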
↧
Forum Post: RE: How to Repeat execute codeunit with changecompany?
It works in a different way for codeunits. You can't change company for a codeunit, but you can execute it in a separate session that is started in the context of the company specified in the session parameters: STARTSESSION(SessionId, CodeunitId, CompanyName). https://docs.microsoft.com/en-us/dynamics-nav/startsession-function--sessions
↧
Forum Post: RE: How to Repeat execute codeunit with changecompany?
Hi Alexander, I tested STARTSESSION like this:

Company.RESET;
IF Company.FINDSET THEN BEGIN
  REPEAT
    STARTSESSION(SessionID, CODEUNIT::"Upgrade New Version", Company.Name);
  UNTIL Company.NEXT = 0;
  MESSAGE('Done !');
END;

but it seems not to be working. Any idea?
↧
Blog Post: Extending Role Centers
There should be a simple rule that applies to all members of our community: if you struggle with something and find out the answer, share it. If everyone starts doing that, I'm sure it will improve our community and bring back some joy in a world where even most MVPs just blog to sell their marketing messages. Again, today I wasted much valuable time of my life chasing weird behaviour in our ForNAV AppSource solution. Like most partners, we struggle to find a good balance with simplicity, and I had this great idea (or at least I thought it was a great idea) to personalise the role center. I added a menu item for our AppSource solution that contains the shortcuts most people need, and I wanted to hide the unimportant ones. When working with page extensions I always try to work with AddLast to have the smallest chance of breaking during upgrades. You can see that the visibility of one of the items is toggled. Here is my code (see the sketch below). Simple, right? Only show Replace Reports to an admin or anyone who can manage their own data. But then the trouble came. This compiles, builds and publishes just fine, but the visibility is not toggled. Then I remembered something from C/SIDE: IT IS NOT ALLOWED TO CODE IN ROLE CENTERS!!! And I am extending a Role Center. So I guess you can call this a bug: when extending a role center it should not be possible to write code. For me this is back to the drawing board, and I will try to come up with a better solution.
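The original code was shown as an image and is lost here; below is a hypothetical reconstruction of the kind of page extension described (the object names and the permission check are placeholder assumptions, not the actual ForNAV code):

pageextension 50101 "ForNAV Role Center Ext" extends "Business Manager Role Center"
{
    actions
    {
        addlast(sections)
        {
            action(ReplaceReports)
            {
                Caption = 'Replace Reports';
                ApplicationArea = All;
                Visible = ShowReplaceReports;
                RunObject = page "Report Layout Selection";
            }
        }
    }

    trigger OnOpenPage()
    begin
        // Compiles, builds and publishes just fine -- but on a Role Center
        // (extension) this trigger never executes, so the Visible toggle
        // silently stays at its default. That is the "bug" described above.
        ShowReplaceReports := UserPermissions.IsSuper(UserSecurityId());
    end;

    var
        UserPermissions: Codeunit "User Permissions";
        ShowReplaceReports: Boolean;
}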
↧
Forum Post: RE: How to Repeat execute codeunit with changecompany?
The idea is that either the codeunit does not have any code in its OnRun trigger (STARTSESSION executes OnRun), or there is an error in it that has to be debugged.
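A hedged C/AL sketch of the full pattern, assuming the upgrade logic lives in the OnRun trigger of the poster's codeunit "Upgrade New Version"; polling the Active Session virtual table is one possible way (NAV 2013 and later) to wait for the background sessions, not the only one:

// Start one background session per company.
Company.RESET;
IF Company.FINDSET THEN
  REPEAT
    STARTSESSION(SessionID, CODEUNIT::"Upgrade New Version", Company.Name);
  UNTIL Company.NEXT = 0;

// Wait until no background sessions of this user remain on this server instance.
ActiveSession.SETRANGE("Server Instance ID", SERVICEINSTANCEID);
ActiveSession.SETRANGE("User ID", USERID);
ActiveSession.SETRANGE("Client Type", ActiveSession."Client Type"::Background);
WHILE NOT ActiveSession.ISEMPTY DO
  SLEEP(5000);
MESSAGE('Done !');

Variables: Company (Record 2000000006 Company), ActiveSession (Record 2000000110 Active Session), SessionID (Integer).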
↧
Blog Post: AL Extension Pack for VSCode
I totally forgot to blog about this – so let me quickly catch up with this one.. :-). Some time ago, after explaining my most used VSCode extensions for AL development for (about) the 829th time, I decided to make my life a bit easier. I had already come across the concept of creating a VSCode extension that acts like a package that automatically installs other extensions. An "Extension Pack", if you will ;-). So … here is the … AL Extension Pack. You can find it here: https://marketplace.visualstudio.com/items?itemName=waldo.al-extension-pack And obviously, you can also download it from within VSCode. What does it do? Well, if you install the AL Extension Pack in VSCode, it will simply automagically install all these extensions as well:
- CRS AL Language Extension
- AL Variable Helper
- AL Code Outline
- AL Object Designer
- Create GUID
- vscode-icons
- AL Language
- Bracket Pair Colorizer 2
- Docker Explorer
- GitLens
- Git History
- PowerShell
- snippet-creator
- Rest Client
- TODO Highlight
- TODO Tree
These are the extensions that I think are indispensable to decently develop for Microsoft Dynamics 365 Business Central. Even more, if you install the "AL Extension Pack", you will automatically get any new extensions that I include in the pack. So … if you have ANY feedback, please do not hesitate to provide it in the issues section of the GitHub repository of this extension, here: https://github.com/waldo1001/ALExtensionPack/issues . Is there anything missing? Is there a better one? Anything! Always appreciated! Enjoy!
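For context, such an extension pack is just a VSCode extension whose package.json lists other extension IDs in an "extensionPack" array. A minimal sketch (the two listed IDs are real marketplace identifiers; the rest of the manifest is illustrative):

{
  "name": "al-extension-pack",
  "displayName": "AL Extension Pack",
  "publisher": "waldo",
  "version": "1.0.0",
  "engines": { "vscode": "^1.30.0" },
  "categories": [ "Extension Packs" ],
  "extensionPack": [
    "ms-dynamics-smb.al",
    "waldo.crs-al-language-extension"
  ]
}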
↧
Blog Post: Free Open-Source Translation tool for Dynamics 365 BC
Making apps, I have tried different translation tools, both on-premises and cloud versions, but I have been disappointed every time. I mean, how hard can it be:
- I want to be able to create multiple projects for translation.
- I want to be able to translate the same project to multiple languages.
- I want to be able to import the source file, and then I want the tool to translate automatically. However ridiculous the translations might be, most are actually OK.
- I want the translation to be free, or maybe even to be able to access multiple APIs (to do).
- I want to be able to define project-specific terms, which might not be translatable, to be inserted into the translated text.
- I want the tool to identify multiple instances of a phrase and translate all of them at once.
- I want to be able to import a translated file fully or partially and reuse my previous translation.
- And I want it to be simple and intuitive.
So, having checked a number of tools, I decided to make one myself. I have made an extension for Dynamics 365 Business Central, and not only that, I have made it Public Domain, so that everybody can help develop it. You can find the beta on my public GitHub account: https://github.com/peikba/AL-Translate-Tool It works like this: import the AL Translation Tool extension into Dynamics 365 Business Central, and the AL Translation Tool appears in the Departments menu and in the search. Start by setting up the module on the Setup page. The Default Source code language will be defaulted for all new projects, and since the native language of Dynamics 365 Business Central is ENU, that will always be the source. Use Free Google Translate API: the limitation is that it is only possible to access the API a limited number of times each hour. This option is included because maybe someone wants to add other APIs later. Project no. series works like everywhere else in Dynamics 365 Business Central. Next are the General Translation Terms that will apply to all projects. It can be a bit difficult, because the translation tools tend to translate the same phrase with different words, so it is the translated word that must be included here. Another use could be to change field names that would be totally different in the different languages. The examples here are in Danish, but it is clear that the translation has used different phrases for the same word. Then it is necessary to add the language ISO codes to the Language table. Now it is time to create the first project: I only add the name and the source language code. I can import the source from my extension already now; in this case I import the Manufacturing Plus.g.xlf file directly from my Dynamics 365 BC extension using the Import Source action. It is possible to see the Translation Source by clicking the Translation Source action: the left side holds all the captions from the extension, and the right side the Developer Notes that were created by the AL extension in Visual Studio Code. Re-importing the Translation Source will not delete the old source, but will add any extra fields to the Translation Source and Target. Then I go back to the project to create one Target Language for each language I want to translate to. I only need to add the language code. In the Translation Terms page, it is possible to copy from the General Translation Terms. It is also possible to add extra Translation Terms from this project to the General Translation Terms.
Back on the Target Language page, it is possible to access the Translation Target page. Every time it is accessed, it will create all new target translations from the source file. Here there are two extra fields. Translate: mark this if the source should be included in the Translate All action; after the automatic translation has been made, the field will automatically be cleared (the Translate action ignores it and translates regardless). Target: this will be the translation, which can originate from Google Translate, maybe combined with a Translation Term, or from manual translation. On entering a manual translation for a source phrase with multiple instances, it is possible to confirm changing all instances at once. There is also an extra FactBox showing how many instances of the source phrase exist. Using the Translate button, the Google Translate API will be called and the result will be validated against the project Translation Terms. In some cases the setup will be translated, but in this case it wasn't, and I need to translate it myself. Notice that it translated both the table and the page; the origin is noted in the Developer Notes FactBox. Changing it manually gives a confirmation message, and then it is correct. Using the Translate All action, I first need to mark the lines I want to include, so I click Deselect All, mark a number of lines and click Select All, and then click Translate All. I haven't found the exact number of translations allowed per hour, but I think it's around 30. Now let's take a look at the result: it is obvious that the Translation Terms have kicked in, and that a few terms are missing: purchase creditmemo, SKU, docking. They need to be added to the terms, and I can mark the lines to rerun and run the translation again. I can then export the translation file to a new .xlf file; in this case the name will be Manufacturing Plus.g.DA-DK.xlf. I then include it in my AL extension project, recompile, publish and install. In the result it is obvious that I have made a partial translation: the fields marked in red were not included. If I already have a full or partial translation file, it is possible to import the file directly into the Target Language page. A warning will be given; answering yes, the file is imported. In this case it was a full translation from before, and I can keep working with this file and export it again. It is possible to import translation files with only part of the solution translated. Issues: right now, the Developer Notes are not exported correctly. Anybody who wants to give it a shot, please contact me directly. The issues are described in the Issues List Word file. So, download it, play with it and give me your comments.
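For reference, the .xlf files the tool imports and exports are standard XLIFF 1.2 documents as generated by the AL compiler. A minimal sketch of one translated unit (the id and texts are illustrative, not taken from the Manufacturing Plus app):

<trans-unit id="Table 123456 - Field 1 - Property Caption" translate="yes">
  <source>Description</source>
  <target>Beskrivelse</target>
  <note from="Developer" annotates="general" priority="2"></note>
  <note from="Xliff Generator" annotates="general" priority="3">Table MyTable - Field Description - Property Caption</note>
</trans-unit>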
↧
Blog Post: Meta UI Control Add-Ins for Business Central, how and why…
Today we successfully completed a workshop at Vos Transport with Global Mediator, where we demoed a prototype of a new user interface defined on metadata embedded in Microsoft Dynamics NAV using client add-ins. We've been working on preparing this workshop for over 12 months, and I am very happy with and proud of the result. Client Add-Ins Within Microsoft Dynamics NAV, client add-ins have been possible for a long time, yet not a lot of partners have picked up the technology except for edge cases such as Point of Sale and rich text editors. The problem we are trying to solve for Vos Transport is giving the users of the planning system insight into the data. This is a customized part of their NAV system, but you could compare it to manufacturing or service order scheduling challenges. For many years we've struggled with the limitations of the NAV user interface, such as the lack of drag and drop, resizable row heights, conditional coloring, concatenated columns and double/right mouse-click events. On top of that, we wanted visual insight into the planning using either timelines and/or visual components on maps, with modern options like heatmaps. I've seen other partners build external components for this, mostly using web services, but I've never felt comfortable with these, since they make navigating back and forth to NAV very hard. Flexible & Low-Code Even though I am proud of our internal IT department at Vos, we cannot take the responsibility of creating our own UI for something like this, and even if we could, we lack the in-house skills for that level of front-end development. This is why we started to talk with Global Mediator about building a brand new Page Designer that allows a new level of UI flexibility in NAV/Business Central. The goal is that any NAV admin can configure these pages without any knowledge of complex frameworks. A little knowledge of the NAV data model, basic HTML and JavaScript will suffice; the latter only for conditional formatting, to create simple boolean expressions. The Result In essence, the Meta UI tool allows you to convert any list page in NAV to a new format where any formatting rule can be applied, rows can be concatenated, pyjama printing can be applied and, if you want, one column can be rendered as a representation of a timeline. The tool allows expandable subpages where multiple rows can be expanded at the same time, allowing us, for example, to drag and drop a sales line from one sales order to another. Microsoft Graph API & JavaScript Add-Ins The technologies used behind the scenes are the new Microsoft Graph API and JavaScript add-ins, both introduced in NAV 2018, making that the "oldest" version to support the Meta UI. If you want to build a Meta UI page on a customised table, you need to generate the API definition, which will automatically be added to the endpoint by the NAV framework. Next Steps The Meta UI will be taken into production this weekend and rolled out to a few pilot users to give feedback, especially about the performance. In September we will evaluate the experience and decide whether the tool will be implemented across the whole company, in all our offices across Europe. Roadmap We've already started scoping the next project using the Meta UI tool, which will include an HTML editor and a PDF viewer that will help us convert orders we receive in PDF format more automatically. Are you interested?
Now that we've proven the technology and moved the first customer into production, we want to talk to other NAV/Business Central partners who have the same need for a more advanced grid component, map controls, PDF viewers or HTML editors. Maybe you will challenge us to add more components to the toolbox. It requires a minimal learning curve to get started and shows great results very quickly. I will share some screenshots very soon, after we do some clean-up based on the workshop's feedback.
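For readers new to the technology: a JavaScript client add-in is declared in AL as a controladdin object that bridges JavaScript and AL. A minimal sketch, with placeholder names that are not the actual Meta UI implementation:

controladdin MetaUIGrid
{
    Scripts = 'scripts/grid.js';
    StartupScript = 'scripts/startup.js';
    StyleSheets = 'styles/grid.css';
    RequestedHeight = 640;
    HorizontalStretch = true;
    VerticalStretch = true;

    // Raised from JavaScript back into AL, e.g. on a double-click event
    // that the standard NAV grid cannot expose.
    event OnRowDoubleClick(RowId: Integer);

    // Called from AL to push the metadata-driven page configuration
    // (formatting rules, concatenated columns, timeline column) as JSON text.
    procedure LoadConfiguration(ConfigAsJson: Text);
}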
↧
Forum Post: Can we make ExternalName dynamic for ExternalSQL?
Hi all, I am using ExternalSQL to push data from company A (DB-01) to companies B, C, D, E, etc. (DB-02). I can push data from company A to company B, but can the company name in ExternalName be made dynamic, so that I don't have to change the properties one by one? Thanks
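For context, a minimal sketch of the kind of table the question describes: with TableType = ExternalSQL, the ExternalName property is compile-time metadata pointing at one specific external table (all names below are illustrative):

table 50100 "External Customer"
{
    TableType = ExternalSQL;
    // The property the poster wants to vary per company -- it is static here.
    ExternalName = 'Company B$Customer';

    fields
    {
        field(1; "No."; Code[20]) { ExternalName = 'No_'; }
        field(2; Name; Text[100]) { ExternalName = 'Name'; }
    }
}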
↧
↧
Blog Post: Dynamics 365 Business Central + Azure Cosmos DB for globally distributed integrations
Today's post gives an overview of a successfully deployed geo-distributed integration between Dynamics 365 Business Central and some local B2B applications, and it wants to leave a message for all: enlarge your vision to other Azure services if you want to create globally distributed architectures that rock! The scenario: the headquarters of a large commercial company is located in West Europe and uses Dynamics 365 Business Central SaaS as its main ERP system. This company creates items, and when these items are ready for selling worldwide, it needs to distribute its product catalog to every point of sale in the world (in many countries on different continents). The points of sale (shops) have locally installed apps that read this product catalog to know immediately which items can be sold, at what price, with what availability and so on. The first solution was to use the Dynamics 365 Business Central APIs: the geo-distributed local apps call custom D365BC API pages and retrieve the data. This solution had two main "problems":
- some countries experienced long latency when retrieving data
- too many API requests were forwarded to the D365BC tenant during the day
The second solution was to use Azure SQL as an "intermediate layer": D365BC sends item records to an Azure SQL database by calling an Azure Function, and the local apps call other Azure Functions to retrieve the data from this "centralized" Azure SQL database instance. Pros and cons of this solution:
- the D365BC tenant no longer has to handle too many requests for data
- some countries still experienced long latency when retrieving data
- cost increased
The main problem to solve for a good architecture in this scenario was avoiding big latencies in some countries. To reach this goal, data should be as near as possible to the local application (so in every country where the company has a shop). For this business case the data does not strictly need a relational database under the hood; we can also use a NoSQL database. Final solution: the solution that reached all the goals is described in the following schema. For this solution we're using an instance of Azure Cosmos DB for storing the item data. Azure Cosmos DB is Microsoft's globally distributed, low-latency, high-throughput, always-on, multi-model database service. Azure Cosmos DB is a NoSQL database, and one of its key benefits is that it transparently replicates your data wherever your users are, so your users can interact with a replica of the data that is closest to them. Azure Cosmos DB allows you to add or remove any of the Azure regions in your Cosmos account at any time, with the click of a button. In our scenario, we have an instance of Azure Cosmos DB in West Europe. In the same region we have an Azure Function that is called from Dynamics 365 Business Central to store item data in an Azure Cosmos DB document collection. Items are stored as JSON documents. The Azure Cosmos DB database is then geo-replicated into the N different Azure regions we need for our business. The local applications retrieve the item data by calling an Azure Function that in turn retrieves the data from its nearest Azure Cosmos DB database. The calls are managed by an Azure Traffic Manager that redirects them to the right Azure Function (region).
The main Azure Cosmos DB is created from the Azure Portal as follows. The Azure Function used for sending the item data from Dynamics 365 Business Central to Azure Cosmos DB is an HTTP trigger deployed to the same Azure region. The function (called SendItemsToCosmosDB) uses the Microsoft.Azure.DocumentDB.Core package and is defined as follows:

[FunctionName("SendItemsToCosmosDB")]
public static async Task<IActionResult> Run(
    [HttpTrigger(AuthorizationLevel.Function, "post", Route = null)] HttpRequest req,
    ILogger log,
    Microsoft.Azure.WebJobs.ExecutionContext context)
{
    log.LogInformation("C# HTTP trigger function processed a request.");

    // Reads the incoming JSON
    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    dynamic data = JsonConvert.DeserializeObject(requestBody);

    // Read local.settings.json
    var config = new ConfigurationBuilder()
        .SetBasePath(context.FunctionAppDirectory)
        .AddJsonFile("local.settings.json", optional: true, reloadOnChange: true)
        .AddEnvironmentVariables()
        .Build();
    CosmosDBEndpoint = config["CosmosDBEndpoint"];
    CosmosDBMasterKey = config["CosmosDBMasterKey"];

    SaveItemToCosmosDB(data);

    return ItemNo != null
        ? (ActionResult)new OkObjectResult($"Item {ItemNo} stored.")
        : new BadRequestObjectResult("Error on input data.");
}

The Azure Function supports the HTTP POST method. It receives a JSON document as input (an item representation), retrieves the Azure Cosmos DB parameters from the configuration settings (endpoint and primary key; you can retrieve them from the Azure Portal by going to your Cosmos DB database and clicking Keys) and then saves the item document (JSON) to the main Azure Cosmos DB instance. The method that stores the JSON data in Azure Cosmos DB is called SaveItemToCosmosDB and is defined as follows:

private static void SaveItemToCosmosDB(dynamic Item)
{
    Task.Run(async () =>
    {
        using (var client = new DocumentClient(new Uri(CosmosDBEndpoint), CosmosDBMasterKey))
        {
            // Create a new Cosmos DB database (if it does not exist)
            var databaseDefinition = new Database { Id = "ItemDb" };
            var database = await client.CreateDatabaseIfNotExistsAsync(databaseDefinition);

            // Create a new document collection (a database is a container that holds a number of collections)
            var collectionDefinition = new DocumentCollection { Id = "ItemCollection" };
            var collection = await client.CreateDocumentCollectionIfNotExistsAsync(
                UriFactory.CreateDatabaseUri("ItemDb"), collectionDefinition);

            // Insert the document in the collection. To insert a new document, you need two things:
            // - the path to the collection: dbs/{databaseName}/colls/{collectionName}
            // - the document object
            var itemDocument = await client.CreateDocumentAsync(
                UriFactory.CreateDocumentCollectionUri("ItemDb", "ItemCollection"), Item);
        }
    }).Wait();
}

I've placed comments between the code lines, so I think it's quite self-explanatory: we create a new database (if it does not exist, here called ItemDb), we create a document collection (if it does not exist, here called ItemCollection), and we create a document in this collection with the JSON received as input from Dynamics 365 Business Central. This Azure Function is called from AL by passing a JSON representation of an item. The AL procedure code is as follows:

procedure SendItem(ItemNo: Code[20])
var
    httpClient: HttpClient;
    httpContent: HttpContent;
    jsonBody: Text;
    httpResponse: HttpResponseMessage;
    httpHeader: HttpHeaders;
    Item: Record Item;
begin
    Item.Get(ItemNo);
    jsonBody := '{"itemNo":"' + Item."No." +
        '","itemDescription":"' + Item.Description +
        '","itemEnabledForSelling":' + Format(Item."Enabled For Global Selling") + '}';
    httpContent.WriteFrom(jsonBody);
    httpContent.GetHeaders(httpHeader);
    httpHeader.Remove('Content-Type');
    httpHeader.Add('Content-Type', 'application/json');
    httpClient.Post(BaseFunctionURL, httpContent, httpResponse);
    // Here we should read the response
    Message('Item registered on Azure Cosmos DB.');
end;

When the Azure Function is called, we get an HTTP response like the following. If we check our Azure Cosmos DB via the Data Explorer in the Azure Portal, we can see that the item document is stored: Azure Cosmos DB adds an ID to the document record itself, plus other internal fields. The main Azure Cosmos DB database is geo-replicated to every Azure region we need for our business. The geo-replication can easily be performed directly via the Azure Portal by selecting your Azure Cosmos DB instance and then going to Replicate data globally. Here you can select, with a click, all the regions where you want a database replica, and whether you want a read-only or a write-enabled replica. That's done! Your NoSQL database is replicated geographically! The Azure Cosmos DB instance was created by selecting the Core (SQL) API (see figure 1). This means that the database supports querying items using Structured Query Language (SQL) as a JSON query language, so you can perform SQL-like queries on your unstructured data to retrieve items. The Azure Function that retrieves the data (here called GetItemsFromCosmosDB) is defined as follows:

[FunctionName("GetItemsFromCosmosDB")]
public static async Task<IActionResult> GetItems(
    [HttpTrigger(AuthorizationLevel.Function, "get", Route = null)] HttpRequest req,
    ILogger log,
    Microsoft.Azure.WebJobs.ExecutionContext context)
{
    log.LogInformation("C# HTTP trigger function processed a request.");
    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();

    // Read local.settings.json
    var config = new ConfigurationBuilder()
        .SetBasePath(context.FunctionAppDirectory)
        .AddJsonFile("local.settings.json", optional: true, reloadOnChange: true)
        .AddEnvironmentVariables()
        .Build();
    CosmosDBEndpoint = config["CosmosDBEndpoint"];
    CosmosDBMasterKey = config["CosmosDBMasterKey"];

    string jsonResponse = GetCosmosDBItems();

    return jsonResponse != null
        ? (ActionResult)new OkObjectResult(jsonResponse)
        : new BadRequestObjectResult("Error on retrieving data.");
}

The method that retrieves the item documents from Azure Cosmos DB (here called GetCosmosDBItems) is defined as follows:

private static string GetCosmosDBItems()
{
    string Json = string.Empty;
    using (var client = new DocumentClient(new Uri(CosmosDBEndpoint), CosmosDBMasterKey))
    {
        // SQL-like query over the document collection
        var response = client.CreateDocumentQuery(
            UriFactory.CreateDocumentCollectionUri("ItemDb", "ItemCollection"),
            "select * from c").ToList();
        Json = JsonConvert.SerializeObject(response);
    }
    return Json;
}

The method performs a SQL-like query on our collection and returns all the data. Obviously, you can adapt this method to perform the query you need. If you send a GET request to this Azure Function, the response is the JSON representation of your document collection data (our stored item records). Gain in performance and latency? From 5% (same Azure region as the Dynamics 365 Business Central instance) to more than 50% (for remote regions). I think this is something that must be kept in mind…
↧
Blog Post: Delivering Application Symbols to enable hybrid deployments
If you are like us at Axians Infoma and can’t make the transition of your on-premises solution from C/AL to AL with a single snap of your fingers, you may want to use hybrid deployments as a first step. This means shipping your standard C/AL solution and publishing dependent extensions on top of that... Read the full text.
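As background (not from the excerpt above): for hybrid deployments, the AL symbols for the C/AL application are typically generated with the development client so that dependent extensions can compile against them, for example:

finsql.exe Command=generatesymbolreference, Database="Demo Database NAV (11-0)", ServerName=.\NAVDEMO

The database and server names here are placeholders from the standard demo installation.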
↧
Blog Post: Using Gmail Account with SMTP Setup with Dynamics Business Central / Dynamics NAV
Overview Not all customers use Office 365 for their e-mail. The other popular option companies use is G Suite from Google. When you try to set up SMTP Mail using a Gmail account in G Suite, you may encounter this error: The mail system returned the following error: "Failure sending mail. Unable to read data from the transport connection: net_io_connectionclosed.". Resolution The problem is with how Google decides which applications it deems less secure. If you're using an application it deems less secure, Google will refuse the connection. What you'll need to do is change your settings in Gmail: turn on Less Secure App access. Conclusion Now when you go back to your Dynamics 365 Business Central (aka Dynamics NAV) application, you will be able to send mail with these SMTP settings.
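For reference, the standard Gmail values to enter on the SMTP Mail Setup page (assuming Google's published SMTP endpoint; field names vary slightly between NAV/BC versions):

SMTP Server:       smtp.gmail.com
SMTP Server Port:  587
Authentication:    Basic
Secure Connection: Yes
User ID/Password:  the G Suite mailbox credentials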
↧
Forum Post: BC SaaS Model not allowed to modify VAT Entry?
Hi all, does the BC SaaS version not allow modifying table data? I have created an extension with permission to modify table 254 (VAT Entry), but when I post, there is an error message saying I am not allowed to modify TableData VAT Entry. I have always done this on the on-premises version (neither BC 13.0 nor BC 14.0 has any issue), but when I implemented it on the SaaS model, I got this error. Please advise. Thanks
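For illustration, the kind of permission declaration the poster describes looks like this in AL (a sketch with placeholder object names; whether SaaS actually honours it for this table is exactly the open question):

codeunit 50100 "VAT Entry Updater"
{
    // Grants this codeunit indirect read/modify permission on VAT Entry.
    Permissions = tabledata "VAT Entry" = rm;

    procedure UpdateEntry(EntryNo: Integer)
    var
        VATEntry: Record "VAT Entry";
    begin
        VATEntry.Get(EntryNo);
        // ... change fields here ...
        VATEntry.Modify();
    end;
}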
↧
Forum Post: RE: Report Makeing
[quote userid="57886" url="~/nav/f/developers/96895/report-makeing"] want create a Report Navision 2016 table data[/quote] Hi. To be honest, I have read this several times and still can't understand what report you want to create.
↧
Forum Post: Anyone know where to find the proper resources for LS Retail, LS NAV 2016 or 2018?
Hi everyone, LS Retail is a hot cake in the retail industry, required by plenty of employers. I have been getting lots of offers for LS Retail; it looks interesting, but I have no idea where to start. Looking forward to receiving valuable info from my experienced DUG community members!
↧
Blog Post: Opinion – What will happen in fall with Business Central
As far as I remember, social media around Business Central/NAV(ision) has never been as quiet as in the last months. There is nothing from the MVPs anymore, and it seems the majority of partners are in the dark about what's going on at Microsoft. Meanwhile, Microsoft is periodically updating the GitHub repository with new code for the new AL foundation, but without explaining the strategy. ( https://github.com/microsoft/ALAppExtensions ) In fall, Microsoft will release the first Business Central without C/SIDE and with refactored AL, and because of that the shipping of daily insider builds has been blocked for a few months. In the "old days" Microsoft would have a code freeze before the summer vacation, and partners would get a build (DVD) which was very close to what Microsoft would ship at Directions. Now there is nothing, except the GitHub repository without any guidelines on how to use it. I've seen speculation that the foundation will be dozens of small extensions, but I think this is a false rumour. My expectation is that each BC install will have three extensions. Foundation: this is what you see on GitHub today. Most of it is stuff that should probably be part of the AL programming language, such as TempBlob and Excel Buffer; things NuGet offers for .NET. Application: this is what we have today as General Ledger, Inventory, Sales & Purchase, Jobs, Manufacturing etc. It will be based on Foundation, but not broken up into smaller extensions, even though that would be my preferred choice. Microsoft should have started that years ago, and now they simply lack the time and skills; they fired most of the functional folks years ago in favor of a large platform and UI team. This will lead to much rewriting of code, but not as much as most think. After this I hope and expect that the design of the Application will be frozen, since Microsoft cannot expect their partners to continuously refactor their code. We simply don't have the resources to do that. Most partners don't even have automated testing in place, and refactoring is too expensive. Large partners can write their own Application on top of Foundation, allowing them to be on AppSource without Microsoft having to add half a million events for each business case. Localizations: this is probably the coolest part, if Microsoft can pull it off. Each localization will be an extension on top of Foundation and Application. Speculation & Opinion I'm writing this in order to start a discussion and get some feedback on what others expect. The information in this blog post is in no way confirmed by Microsoft.
↧