Channel: Dynamics 365 Business Central/NAV User Group

Blog Post: How to Avoid Mutually Assured Destruction While Implementing a New ERP System

Overview

While working with companies on ERP implementations (in our case, Microsoft Dynamics 365 Business Central), what I tend to notice more often than not is that companies know what they want to get out of an ERP, but not how to get it. The result is that managers create their own chaos and send people in different directions. It also means that internal managers often sabotage their own ERP implementations. This article is part of a series on how internal managers can better implement new ERP systems. All roads lead to Rome, but some of them are steadier than others.

Common System for a Common Goal

It's not rocket science: when companies decide to switch to a new ERP/accounting system, it's to address specific problems they're facing. Usually, it's because they have reached the limits of their current software, or because it would be too expensive to do otherwise. While the new ERP offers a positive cost-benefit outcome, managers need to agree on common goals when implementing the new system if they want a successful implementation and to avoid mutually assured destruction.

After a company shops around and finds the system that fits, the steps that follow are typically planning, configuring and then implementing the software. One modest but extremely important goal, in most cases, is simply to go live with the system. Even though this is a simple and vague goal, it keeps everyone involved working toward the same objective: moving from the old, legacy software to the new one. It also means transferring crucial information to the new system in a way that fits your company's culture and ways of working. Here are the dos and don'ts when it comes to implementing a new ERP system.

Too Much Potential Is Like Too Little

There's one thing managers need to keep in mind while implementing a new ERP system: time is money. The potential offered by the new software may appear limitless, but while this perception of infinite improvement may prevail for a while, setting expectations too high could have the very opposite effect. This is why a strategy is necessary. A new ERP could help your company with (but is not limited to):
Automation
Better information for customers
Accuracy
Process re-engineering
Better analysis/reporting for management
While the ultimate goal of moving to new software is to make everything better, too much of a good thing may turn out to be bad. Think about it: more often than not, vendors promise the moon to their clients. When managers watch the software's demo, they will think the possibilities are infinite. But often, the honeymoon phase doesn't last. Keep in mind that these people are salespeople, and your company needs a down-to-earth, strategic approach to implementing the software. Make sure you know from the beginning what your company wants to get from the system, so you know what to prioritize. This is when the software genius comes in and the learning curve starts to climb. Shortly after the product is acquired, the manager will look at it and learn about the new features to figure out how they can benefit the company. Suddenly, so many problems the company had with the old system are going to be solved. Suddenly, a whole new world opens up to your company. Then an endless list of ideas comes out of the manager's expectations of the new system.

One thing to keep in mind when implementing a new ERP is that developers are likely to say yes to most of your requests. Developers tend to see every customer request as a challenge, without necessarily thinking about budgets. The result: developers overpromise and managers run wild. To avoid this situation, the solution is to have a specific end goal in mind, with specific areas to improve. Most people do a cost-benefit analysis based on the cost and the improvements it would bring; however, the time needed to get it up and running is often overlooked. When you start a bunch of projects, especially during a period of high stress such as moving to a new system, you end up stretching your resources very thin. Too often, resources are stretched indefinitely and the end goal becomes so elusive that even the manager tends to forget it. This is why successful ERP implementations begin with specific, realistic goals that fit your company's culture and needs; they will save you time and money.

Conclusion

I've worked with many companies in the process of implementing new software. While I have complete confidence that these systems improve a company's efficiency and, in the long run, make everybody's life better, my experience has shown that not having a full-length implementation strategy costs time and energy; and since time is money, investing in time-saving strategies is the way to go. It's simple: focus on the main task at hand. Go one step at a time. While it may take slightly more time, the end result will be less confusion and better results. The last thing you want, when implementing a new system, is to sow fear and uncertainty. Fear and uncertainty lead to failure. Strategy, coherence, and preparedness lead to success.

Forum Post: Running a job queue every 30 seconds in Microsoft Dynamics NAV 2018

I have read an article on how to achieve 30-second job queue runs, but I don't understand it, because on the job queue page there is a field called No. of Minutes between Runs and it only accepts integers. I would prefer a demonstration to a discussion. Thanks.

Blog Post: 365 Saturday: Business Central Day

For the first time, this Saturday (May 23) 365 Saturday will host a full-day event totally dedicated to Dynamics 365 Business Central. The agenda below is in the GMT (UTC+0) time zone:
8:00 – Track Inventory with Item Tracking – Bandam Sairam
9:00 – Handling your Dynamics 365 Business Central SaaS tenant – Stefano Demiliani
10:00 – Business Central Flash Methodology – Paul Soliman
11:00 – Accountant's Hub for Business Central – Olister Rumao
12:00 – Document Automation for Business Central – Fabian Huber
13:00 – Using the right D365BC setup to support different Warehousing business requirements and processes – José Miguel Azevedo
14:00 – Business Central: Getting Started – Mary Thompson
15:00 – Power BI & Business Central – from zero to BI hero – Renato Fajdiga & Steven Renders
16:00 – Using Power Automate to create Approvals in Business Central – Mary Thompson
17:00 – How to sync Master Data using Dynamics 365 BC with Power Automate (No Code) – José Miguel Azevedo
18:00 – Integration capabilities for Business Central: Intelligent Edge and Azure Service Bus – Andrey Baludin
19:00 – Integration of Business Central with Customer Engagement using OData via Azure Logic Apps – Ambesh Singh
We're waiting for you…

Forum Post: Running a job queue every 30 seconds with Microsoft Dynamics NAV 2018

Hi Team, I have read an article about how to get a job queue to run every 30 seconds, where the author mentioned creating two job queue entries for the same task and setting them up manually, but I don't know how to achieve that with the No. of Minutes between Runs field. Thanks.
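For anyone landing on this thread, here is a hedged sketch of the workaround referred to above: create two Job Queue Entry records for the same codeunit, each recurring every minute, with the second entry's earliest start date/time offset by 30 seconds, which together give an effective 30-second cadence. The sketch uses AL syntax (the same logic applies in C/AL for NAV 2018) and the object IDs are illustrative:

    codeunit 50101 "Create 30s Job Queue Entries"
    {
        trigger OnRun()
        begin
            CreateEntry(CurrentDateTime);           // first entry starts now
            CreateEntry(CurrentDateTime + 30000);   // second entry starts 30 seconds later
        end;

        local procedure CreateEntry(EarliestStart: DateTime)
        var
            JobQueueEntry: Record "Job Queue Entry";
        begin
            JobQueueEntry.Init();
            JobQueueEntry.ID := CreateGuid();
            JobQueueEntry."Object Type to Run" := JobQueueEntry."Object Type to Run"::Codeunit;
            JobQueueEntry."Object ID to Run" := 50102; // the codeunit that does the actual work (illustrative)
            JobQueueEntry."Recurring Job" := true;
            JobQueueEntry."No. of Minutes between Runs" := 1;
            JobQueueEntry."Earliest Start Date/Time" := EarliestStart;
            JobQueueEntry.Insert(true);
            // Standard way to set the entry to Ready and schedule it:
            Codeunit.Run(Codeunit::"Job Queue - Enqueue", JobQueueEntry);
        end;
    }

After running this once, the Job Queue Entries page should show two Ready entries for the same codeunit, half a minute apart.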

Forum Post: Possible to use C/side dev program on a PC on an external network to create a table?

The setup is:
Database: Azure SQL with database authentication
NAV server instance: Azure VM with the client and management ports open to the internet
Development PC: on a different network (a local "home" PC), with C/SIDE, the NAV client and the hosts file edited so that the Azure VM name resolves to its public IP
On the development PC, C/SIDE can open the database, and the NAV client works when running a page or table. However, when trying to create a table it pops up an error message with some suggestions such as opening the management port, running C/SIDE as admin or disabling UAC (all of which have been tried). I have no problem creating tables with C/SIDE on the Azure VM that hosts the NAV instance, or from another Azure VM on the same network. Is this issue solvable, so that it is possible to use C/SIDE remotely to create tables without installing a NAV instance locally or using a VPN connection to the server with the instance, etc.?

Blog Post: Sending Azure Alerts to Dynamics 365 Business Central

When working with Dynamics 365 Business Central (but also with Azure applications in general) it's a recommended best practice to send telemetry data to the Azure Application Insights service for centralized monitoring. I've talked about that in the past here. Have you ever received a request to monitor some Azure resources directly from inside Dynamics 365 Business Central and be able to collect metrics from Azure for statistical purposes inside your ERP? Yes, I also have this request in my "strange ideas" collection, and this post describes a possible solution for this absolutely uncommon task. The request here is to monitor the usage of an Azure Virtual Machine (but you can extend this scenario to every resource on Azure) and, when this usage is over a defined threshold, to collect this data inside Dynamics 365 Business Central. To keep it simple: in this scenario, every time my monitored Azure Virtual Machine's CPU goes over 80%, I want to open a task inside my ERP (the creation of a record in a custom table). To reach our goal we need to create an alert rule for the Azure resource that we want to monitor, and this alert rule must trigger an API exposed by Dynamics 365 Business Central (in particular, a custom API, because the Task table where I want to save my data is a custom table). The needed steps are the following:
Create an alert rule on the Azure resource
Configure the alert rule to trigger an Azure Function or an Azure Logic App
Create an Azure Function or an Azure Logic App that receives the incoming request and calls the custom API exposed by Dynamics 365 Business Central
In this post I don't want to describe how to publish a custom API with Dynamics 365 Business Central (I assume that you've previously done this by creating a Task table and publishing an API page over this table; a sketch of what these could look like is included at the end of this post). In this post I also want to use the "low code" approach, so we'll see how to use an Azure Logic App instead of an Azure Function. When you create an alert for an Azure resource with Azure Monitor, you can activate the common alert schema definitions for webhooks, Azure Logic Apps, Azure Functions, and Azure Automation runbooks. Any alert instance describes the resource that was affected and the cause of the alert. These instances are described in the common schema in the following sections:
Essentials: a set of standardized fields, common across all alert types, which describe what resource the alert is on, along with additional common alert metadata (for example, severity or description).
Alert context: a set of fields that describe the cause of the alert, with fields that vary based on the alert type. For example, a metric alert includes fields like the metric name and metric value in the alert context, whereas an activity log alert has information about the event that generated the alert.
A platform alert for the Percentage CPU metric has the following JSON representation:

    {
      "alertContext": {
        "properties": null,
        "conditionType": "SingleResourceMultipleMetricCriteria",
        "condition": {
          "windowSize": "PT5M",
          "allOf": [
            {
              "metricName": "Percentage CPU",
              "metricNamespace": "Microsoft.Compute/virtualMachines",
              "operator": "GreaterThan",
              "threshold": "25",
              "timeAggregation": "Average",
              "dimensions": [
                {
                  "name": "ResourceId",
                  "value": "3efad9dc-3d50-4eac-9c87-8b3fd6f97e4e"
                }
              ],
              "metricValue": 31.1105
            }
          ],
          "windowStartTime": "2019-03-22T13:40:03.064Z",
          "windowEndTime": "2019-03-22T13:45:03.064Z"
        }
      }
    }

More information about the alert schema can be found at the following link: https://docs.microsoft.com/en-us/azure/azure-monitor/platform/alerts-common-schema-definitions

Let's start by creating the Azure Logic App. From the Azure Portal, create a new Azure Logic App and select the When an HTTP request is received template (this creates an HTTP-triggered Logic App for you). Then click on Use sample payload to generate schema in order to set the request body JSON schema. In the sample schema window, you can directly paste the JSON schema from the link above, but you can also use the Azure Monitor Metric Alert Handler template, which creates the same task with the payload automatically loaded for you. This task retrieves the alert details sent by Azure Monitor (we'll talk about this later), and now we need to call our Dynamics 365 Business Central API to store the data we want. My custom Task API has the following endpoint: https://api.businesscentral.dynamics.com/v2.0/TENANTID/sandbox/api/sd/customapi/v1.0/companies(COMPANYID)/tasks

For this we cannot use the standard Dynamics 365 Business Central connector; we need to create a custom connector. We can do that directly from the Azure Portal. In the custom connector definition, we need to specify the host and the base URL. In the Security section we need to specify the authentication for the connector. For simplicity, here I'm using Basic Authentication, and to support this I need to create two parameters for storing the username and the web service access key (please don't provide actual credentials here). Then we need to create the Actions that this connector supports. Here I need to call my Dynamics 365 Business Central custom API to insert a new Task record, so I need to perform a POST request. I'm creating an action called PostTask that performs a POST HTTP request to the provided URL. As you can see, I'm passing the company ID as a request URL parameter to this method. In the Body section of the method definition I'm pasting the JSON body of my API (as required by the POST). Now we need to define a response; for this I can simply copy the response that my API sends me when I try to insert a record. Now our PostTask action is defined and we can click on Update connector to save our custom connector. The custom connector for our Logic App is ready to be used. Go back to the Logic App Designer page and add a new action. When choosing the action to add, select Custom and then select the custom connector and the action that we want to use (here called Post Task). Now we need to provide the credentials for the connector (in order to be able to access our Dynamics 365 Business Central API), and then we can insert the parameters for the Post Task action.
Here I'm creating a new Task record by setting the description, dueDate and status parameters (required by my custom API in Dynamics 365 Business Central) with the metric values that come from the Azure alert. At this point you can do whatever fits the API you're using (I'm creating a Task record, but you could, for example, store the metric details in a table, and so on). The Logic App that receives the incoming Azure alert and handles it is now ready. But how can we send an alert to this Logic App? We need to create a custom alert definition as follows. From the Azure Portal, select the resource to monitor (in my case it's an Azure Virtual Machine), select Alerts and then create a new alert rule. Click on Select condition to configure when the alert rule will be triggered. In this sample, my rule monitors the Percentage CPU (Platform) signal and, if it's over 80% for a period of 15 minutes, it triggers an alert. After defining the condition for the alert, we need to define the Action group (what to do when the alert is triggered). In the Create action group window, give the action group a name and then, under the Actions tab, select Action type = Logic App. As you can see, here we also have the possibility to call an Azure Function (so cool!). We've selected Logic App, so we need to specify which Logic App must be triggered when the alert occurs (I've selected the previously created Logic App). It's important here to click on the Enable the common alert schema button in order to have the common alert schema definition enabled, and so receive the payload described at the beginning of this post. That's all. We've created a webhook that sends an alert to our Dynamics 365 Business Central ERP whenever the metric on our monitored Azure resource crosses the threshold, all with a "low code" approach.
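The post assumes that a custom Task table and API page already exist in Dynamics 365 Business Central. As a hedged, minimal sketch (object IDs, field names and lengths are assumptions, not taken from the original post; the publisher, group and version values match the endpoint shown above), they could look like this:

    table 50120 "SD Task"
    {
        DataClassification = CustomerContent;

        fields
        {
            field(1; Id; Guid) { }
            field(2; Description; Text[250]) { }
            field(3; DueDate; Date) { }
            field(4; Status; Text[50]) { }
        }

        keys
        {
            key(PK; Id) { Clustered = true; }
        }

        trigger OnInsert()
        begin
            if IsNullGuid(Id) then
                Id := CreateGuid();
        end;
    }

    page 50121 "SD Task API"
    {
        PageType = API;
        APIPublisher = 'sd';
        APIGroup = 'customapi';
        APIVersion = 'v1.0';
        EntityName = 'task';
        EntitySetName = 'tasks';
        SourceTable = "SD Task";
        DelayedInsert = true;
        ODataKeyFields = Id;

        layout
        {
            area(Content)
            {
                repeater(Group)
                {
                    field(id; Rec.Id) { }
                    field(description; Rec.Description) { }
                    field(dueDate; Rec.DueDate) { }
                    field(status; Rec.Status) { }
                }
            }
        }
    }

A POST to the companies(COMPANYID)/tasks endpoint with a JSON body containing description, dueDate and status would then create the Task record that the Logic App action fills in.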

Blog Post: Getting not-out-of-the-box information with the out-of-the-box web client

A few days ago, I saw this tweet: Do you need to see the current database size for a company broken down into table sizes? In #MSDyn365BC 2020 release wave 1, you can do that in the new Table Information page. See more: https://t.co/ShGVYnec2g pic.twitter.com/DGHlr5WlME — Microsoft Dynamics 365 Business Central (@MSDYN365BC) May 23, 2020

And that reminded me of a question I got a few weeks ago from my consultants on how to get more object information from the web client. More in detail: in Belgium, we have two languages for a tiny country (NLB, FRB) that differ from the language used by developers (ENU). Meaning: consultants speak another language than the developers, resulting in misunderstandings. I actually had a very simple solution for them: the Fields table. For developers, a well-known table with information about fields. But hey, since we can "run tables" in the web client (and since this is pretty safe to do because these tables are not editable (and shouldn't be – but that's another discussion :D)), it was pretty easy to show the consultants an easy way to run tables. It's very well described by Microsoft on Microsoft Docs: just add "table=" to the URL the right way, and you're good to go. So for running the Fields table, you could be using this URL: https://businesscentral.dynamics.com/?table=2000000041

And look at that wealth of information: data types, field names, field captions depending on the language you're working in, obsolete information, Data Classification information, … All a consultant could dream of to decently describe change requests and point developers to the right data, tables and fields. This made me wonder though: can we easily get even more out of the web client? Not all Business Central users, customers, consultants, … are developers. So, can we still access this kind of information without access to code, VS Code or anything like that? Yes, we can. In fact, the starting point should be: how do I find objects? Is there a list of objects? And therefore also a list of these so-called system tables? Well, you'll need to … learn how to find "AllObj", and you'll find it all! AllObj is a system table that houses all objects (including the objects from extensions), so if you go to this kind of URL, you'll find all objects in your system: https://businesscentral.dynamics.com/?table=2000000038

You'll see a very simple list of objects, and you can even see the app (package ID) each one belongs to (not that it is important, though …). So – now you know how to find all objects and how to run objects. You can run tables, reports, queries and pages, simply by constructing the right URL (pretty much the same as explained here).

System/Virtual tables
To find these special tables with system information, simply filter the "AllObj" table on "TableData" and scroll down to the system table number range (ID range of 2.000.000.000 and above) and start browsing :-). You'll see that you don't always have permission to read the content … but if you do, you'd be surprised at the data you can get out of the system. Just a few pointers:
Session information: https://businesscentral.dynamics.com/?table=2000000009
All Objects: https://businesscentral.dynamics.com/?table=2000000038
Fields: https://businesscentral.dynamics.com/?table=2000000041
License Permission: https://businesscentral.dynamics.com/?table=2000000043
Key: https://businesscentral.dynamics.com/?table=2000000063
Record Link: https://businesscentral.dynamics.com/?table=2000000068
API Webhook Subscription: https://businesscentral.dynamics.com/?table=2000000095
API Webhook Notification: https://businesscentral.dynamics.com/?table=2000000096
Active Session: https://businesscentral.dynamics.com/?table=2000000110
Session Event: https://businesscentral.dynamics.com/?table=2000000111
Table Metadata: https://businesscentral.dynamics.com/?table=2000000136
Codeunit Metadata: https://businesscentral.dynamics.com/?table=2000000137
Page Metadata: https://businesscentral.dynamics.com/?table=2000000138
Event Subscription: https://businesscentral.dynamics.com/?page=9510

What if I get an error? Well, that happens – like this one: I don't know why it does that – but do know you can always turn to a developer, who can apply the C/AL trick: just create a page in an extension, add all fields from the table, and simply run that page.
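As a hedged illustration of that last trick (the object ID, the chosen system table and its field list are assumptions; pick whichever table the web client refuses to render), such a page could look like this in AL:

    page 50130 "Session Event Viewer"
    {
        PageType = List;
        ApplicationArea = All;
        UsageCategory = Lists;
        SourceTable = "Session Event";   // any system table that errors out via ?table=
        Editable = false;

        layout
        {
            area(Content)
            {
                repeater(Group)
                {
                    field("User ID"; Rec."User ID") { ApplicationArea = All; }
                    field("Server Instance ID"; Rec."Server Instance ID") { ApplicationArea = All; }
                    field("Session ID"; Rec."Session ID") { ApplicationArea = All; }
                    field("Event Type"; Rec."Event Type") { ApplicationArea = All; }
                    field("Event Datetime"; Rec."Event Datetime") { ApplicationArea = All; }
                }
            }
        }
    }

Publish the extension, then run the page with https://businesscentral.dynamics.com/?page=50130, just like the tables above.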

Forum Post: WebService consuming from a temporary table

Hi everyone, I've created a page whose source table is temporary. The source table is one I created myself. The page is this: And the OData URL is filtered this way: http://xxxxxxxxxx:7068/DESA/ODataV4/Company('ZZZZ%20EUROPE%20S.L.U.')/AppItemPrice?$filter=Insize_User eq 'FI159' In the OnOpenPage trigger of the page, I call a function that creates entries in the table. But when I open the URL in the browser, I don't get any result. I think that probably I'm not reading the filter from the URL correctly... Any hint about this? Thank you all, hope you are all healthy!

Forum Post: RE: WebService consuming from a temporary table

I added this to get the filter from the URL: But I still get an empty result...

Blog Post: Azure Application Insights 101

In my series on Application Insights for Microsoft Dynamics 365 Business Central / NAV, this is probably the most boring post. However, it is quite important in order to teach you folks about KQL, the Application Insights API and so on.

Step 1 – Create Application Insights
In your Azure tenant, search for Application Insights and select Add. There is not much to fill in here. The Resource Group is probably the most important field if you have a bigger Azure tenant: you want to group your stuff together.

Step 2 – Grab the key!
After the resource is created, grab the key to your clipboard, then leave the Azure Portal and move to the Business Central admin portal.

Step 3 – Put the key in Business Central and restart your system

Step 4 – Analyse the data
But that's for the next blog, about KQL. This will be a language at least one person in your company needs to master. Definitely.

Wait… is that all? Essentially yes, but there is a caveat… The million-dollar question is probably whether or not to put multiple customers into one Application Insights resource. This probably depends on one question: does your customer want to access the data? If they do, the data needs to be in its own Application Insights resource so you can grant your customer access. The good news is, and we'll get to that, that you can query across Application Insights instances.

Forum Post: RE: WebService consuming from a temporary table

Data in temporary tables is not stored in the database, but in your session's memory. There is nothing for the webservice to show.

Forum Post: RE: WebService consuming from a temporary table

I've solved it by passing Rec to the function. Thank you!
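For readers landing on this thread, here is a hedged sketch (not taken from the original posts) of the pattern the thread converges on: the page's temporary Rec is passed by var to the routine that fills it in OnOpenPage, so the published OData page has records to return. Object IDs, table and field names are illustrative:

    table 50140 "App Item Price"
    {
        DataClassification = CustomerContent;

        fields
        {
            field(1; "Item No."; Code[20]) { }
            field(2; "Unit Price"; Decimal) { }
            field(3; Insize_User; Code[50]) { }
        }

        keys
        {
            key(PK; "Item No.") { Clustered = true; }
        }
    }

    page 50141 AppItemPrice
    {
        PageType = List;
        ApplicationArea = All;
        UsageCategory = Lists;
        SourceTable = "App Item Price";
        SourceTableTemporary = true;
        Editable = false;

        layout
        {
            area(Content)
            {
                repeater(Group)
                {
                    field("Item No."; Rec."Item No.") { ApplicationArea = All; }
                    field("Unit Price"; Rec."Unit Price") { ApplicationArea = All; }
                    field(Insize_User; Rec.Insize_User) { ApplicationArea = All; }
                }
            }
        }

        trigger OnOpenPage()
        begin
            // Rec is the page's own temporary record set. It must be passed by var;
            // filling a separate temporary variable leaves Rec (and the OData result) empty.
            FillPrices(Rec);
        end;

        local procedure FillPrices(var TempAppItemPrice: Record "App Item Price" temporary)
        var
            Item: Record Item;
        begin
            if Item.FindSet() then
                repeat
                    TempAppItemPrice.Init();
                    TempAppItemPrice."Item No." := Item."No.";
                    TempAppItemPrice."Unit Price" := Item."Unit Price";
                    TempAppItemPrice.Insize_User := CopyStr(UserId(), 1, MaxStrLen(TempAppItemPrice.Insize_User));
                    TempAppItemPrice.Insert();
                until Item.Next() = 0;
        end;
    }

Once the page is published as an OData web service, a filtered request like the one in the first post returns whatever OnOpenPage put into Rec.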

Blog Post: Setting up Azure SQL Analytics (Preview) – Dynamics NAV

Telemetry is everything: you cannot have enough data when users start asking you why the system is behaving differently than yesterday or why performance is changing over time. This is where Azure SQL stands out from on-premises. You can get so much more data, and in a way that is easy to analyse. However, you need to know where to find it, because not everything is set up automatically after you create a database. Some is, some is not. This blog is about how to connect Azure SQL Analytics to Azure Monitor. The steps are described in this docs entry and I don't want to repeat existing documentation. I will add some screenshots of results for a 220 GB Microsoft Dynamics NAV database with 80 concurrent users. https://docs.microsoft.com/en-us/azure/azure-sql/database/metrics-diagnostic-telemetry-logging-streaming-export-configure?tabs=azure-portal

Step 1 – Patience!
After you have activated Azure SQL Analytics it will not be visible for a while. It takes time in the background to be generated and put together by the Microsoft minions who control your tenant in the background. Remember that these minions have labour contracts and a right to take a break every now and then.

Step 2 – Azure Monitor & More…
When the minions are finished, the data will show up in Azure Monitor. Search for it in your environment, and then, at least in my case, you have to click on More… This should show a link to your Azure SQL Analytics. In my case with two databases: DEV and PROD.

Step 3 – The Dashboard
The first dashboard you'll see is something like this, except that this one shows data 24 hours after activation and we had a busy Friday with a performance incident. I'll get back to that. There are some interesting statistics already visible here, like wait stats, deadlocks and autotuning. I'll handle wait stats in this blog and maybe I'll get back to deadlocks and autotuning later. There is a "good" reason the autotuning is red and I'll look at that tomorrow (Sunday) when nobody is working on the system.

Step 4 – Drill Down | Database Waits
If we drill down into the Database Waits we see more details on what types of waits we are dealing with. It does not help to look at these waits without narrowing down to specific moments in time when "things go wrong", because specific events relate to specific wait stats and some waits are just there whether you like it or not. We all know CXPACKET, because NAV/Business Central fires a lot of simple queries at the Azure SQL engine, resulting in wasted CPU time. There is not much you can do about that (as far as I know).

Step 5 – Houston, we have a problem!
It's 3:51pm on Friday afternoon when my teammate sends me a message on Skype that users are complaining about performance. Since we just turned on this great feature, I decide to use it and see what goes wrong. We drill down one more time and click on the graph showing the waits. Note that this screenshot was created a day after the incident, but it clearly illustrates and confirms that "something" is off around the time my teammate sent me a message. The wait time on LCK_M_U goes through the roof! We have a blocker in our company. Hey, this is KQL again! Now we are in a familiar screen, because this is the same logging that Business Central Application Insights is using. Drilling down into the graph actually generated a KQL query.

Step 6 – What is causing my block?
To see what query is causing my block, I have to go back to the Azure dashboard and click on Blocks. From here we have two options: if I click on the database graph I am taken into the KQL editor, and if I click on a specific block event I get a more UI-like information screen. Let's click on the latter.

Step 7 – Get the Query Hash
This is where it gets nerdy. The next screen shows the blocking victim and the blocking process. It also shows a Query Hash. This is where I had to use Google, but I learned that each "ad-hoc" query targeted at SQL Server gets logged internally with a query hash. Since NAV/Business Central only uses ad-hoc queries we have a lot of them, and it's important to understand how to read them. What worries me a bit here is the blocking process' status, which is sleeping. I have to investigate this more, but I interpret this as a process that went silent while the user is not actively doing something.

Step 8 – Get the Query
Using Google (DuckDuckGo actually) I also found a way to get these queries, as long as they still exist in the cache of your SQL Server. Simply use this query:

    SELECT deqs.query_hash,
           deqs.query_plan_hash,
           deqp.query_plan,
           dest.text
    FROM sys.dm_exec_query_stats AS deqs
    CROSS APPLY sys.dm_exec_query_plan(deqs.plan_handle) AS deqp
    CROSS APPLY sys.dm_exec_sql_text(deqs.sql_handle) AS dest
    WHERE deqs.query_hash = 0xB569219D4B1BE79E

This will give you both the query and the execution plan. You have to use SQL Server Management Studio to execute this against your Azure SQL database.

Step 9 – Restart the service tier
Unfortunately for me, this journey resulted in having to restart the service tier. We could not identify the exact person/user who had executed the query that was locking. Maybe we will be able to do that in a future incident, since I'm learning very fast how to use this stuff, and time is of the essence when incidents like this happen on production environments. Needless to say, the NAV Database Locks screen was not showing anything; I would have used that otherwise, of course.

Blog Post: Automating the creation of work items in Azure DevOps (from Powershell and from Dynamics 365 Business Central)

If you're using Azure DevOps in your organization for the entire product lifecycle, you know for sure that you can also handle all your project management activities by using work items, boards, backlogs, sprints and so on. I've talked about that in the past here. What we personally use a lot internally is the Work Items feature. By using work items you can track anything: a task to do, a feature to implement, an impediment, an epic, a bug. You can open work items inside a project and easily assign those items to your team members. The interesting part is that you can also commit code linked to a work item or to a bug for complete tracking, directly from Visual Studio Code or Visual Studio. I think that when you start using those features widely, the first thing that comes to your mind is: can I automate the creation of a task or a bug, or in general any work item? This can be useful especially if you want to allow external users to open bugs, for example. Azure DevOps offers a set of REST APIs for automating the creation of things and for reading data from external applications. You can see the available APIs here. We've automated the creation of tasks and bugs (or more in general, the creation of work items) by using an external application. Here I want to share a solution that uses a simple PowerShell script. To access the Azure DevOps REST APIs you need to be authenticated, and so you need to create a Personal Access Token for your organization (top right corner, click on the user icon and select Personal access tokens). For authentication, you need to send a Basic authentication header with the access token you have created to the URL of your organization (https://dev.azure.com/OrganizationName). To create a work item, you need to send a POST request to the workitems API, passing a JSON body that defines your work item (a JSON array). More details can be found here. The script to create a work item with the type you want is as follows: When executed, this is the output in PowerShell: and these are my tasks and bugs created from PowerShell in Azure DevOps: You can find the script here. Quite interesting if you want to automate work item creation from external events, isn't it?

And what if you want to create a system for opening tasks and bugs (or other work items) directly from Dynamics 365 Business Central? You can create an extension that permits you to insert a work item and then calls the Azure DevOps REST API. As an example, I've created an extension with a table that stores my opened work items in Dynamics 365 Business Central as follows: Then I've created a list page as a UI for opening a work item. On this page I have an action that takes all the work items not yet sent to Azure DevOps and sends them to Azure DevOps by calling its REST API. Code for the page is omitted here (very easy to do). The business logic for calling the Azure DevOps REST API is defined in a codeunit as follows (a sketch of what it could look like is included at the end of this post). What happens here? For each work item inserted into Dynamics 365 Business Central, a JSON representation is created, the REST API is called and the work item is created in the selected Azure DevOps project. This sample can obviously be improved. For example, you should parse the API response (JSON) in order to retrieve the Azure DevOps ID of the work item and store this ID in the Dynamics 365 Business Central table for traceability, and so on, but I think you've learned by now that you can automate your DevOps processes from scripts or directly from your ERP.
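Since the script and codeunit screenshots are not reproduced here, the following is a hedged AL sketch of what such a codeunit could look like. The organization/project URL, the object ID and the way the Personal Access Token is retrieved are assumptions; the request shape (a JSON Patch array POSTed to the workitems endpoint with Basic authentication) follows the Azure DevOps REST API documentation referenced in the post:

    codeunit 50150 "DevOps Work Item Mgt."
    {
        procedure CreateWorkItem(WorkItemType: Text; Title: Text; Description: Text)
        var
            Base64Convert: Codeunit "Base64 Convert";
            Client: HttpClient;
            Request: HttpRequestMessage;
            Response: HttpResponseMessage;
            Content: HttpContent;
            ContentHeaders: HttpHeaders;
            ResponseContent: HttpContent;
            Body: JsonArray;
            TitleOperation: JsonObject;
            DescriptionOperation: JsonObject;
            BodyText: Text;
            ResponseText: Text;
            OrganizationUrlTxt: Label 'https://dev.azure.com/MyOrganization/MyProject', Locked = true; // illustrative
        begin
            // JSON Patch document: one "add" operation per work item field to set.
            TitleOperation.Add('op', 'add');
            TitleOperation.Add('path', '/fields/System.Title');
            TitleOperation.Add('value', Title);
            Body.Add(TitleOperation);
            DescriptionOperation.Add('op', 'add');
            DescriptionOperation.Add('path', '/fields/System.Description');
            DescriptionOperation.Add('value', Description);
            Body.Add(DescriptionOperation);
            Body.WriteTo(BodyText);

            Content.WriteFrom(BodyText);
            Content.GetHeaders(ContentHeaders);
            ContentHeaders.Remove('Content-Type');
            ContentHeaders.Add('Content-Type', 'application/json-patch+json');

            Request.Method := 'POST';
            Request.SetRequestUri(OrganizationUrlTxt + '/_apis/wit/workitems/$' + WorkItemType + '?api-version=5.1');
            Request.Content := Content;

            // Basic authentication: empty user name, Personal Access Token as password.
            Client.DefaultRequestHeaders.Add('Authorization',
              'Basic ' + Base64Convert.ToBase64(':' + GetPersonalAccessToken()));

            if not Client.Send(Request, Response) then
                Error('Could not reach Azure DevOps.');
            if not Response.IsSuccessStatusCode then
                Error('Azure DevOps returned HTTP %1.', Response.HttpStatusCode);

            ResponseContent := Response.Content;
            ResponseContent.ReadAs(ResponseText);
            // A real implementation would parse ResponseText to store the new work item ID.
        end;

        local procedure GetPersonalAccessToken(): Text
        begin
            // Assumption: read this from a setup table or Isolated Storage in a real app.
            exit('');
        end;
    }

Calling CreateWorkItem('Task', 'My title', 'My description') from the page action described above would then create a Task work item in the configured project.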

Forum Post: Custom Extension License

Dears, I developed a module using Extension V2, but I want to offer it on a monthly subscription basis, with something like an expiration date for the module. Any idea how that can be done?

Blog Post: Are you ready to move forward “WITH”-out AL?

Sometimes I just have to write my frustration away in order to clear my head. Don't expect technical tips and tricks in this post, but maybe some inspiration. Today I was absolutely flabbergasted. Both on Twitter and on LinkedIn (I am a social media junkie) there were actually threads about Microsoft removing the WITH statement in AL. I was literally like OMG! Go spend your time on the future!! https://github.com/microsoft/AL/issues/5817#issuecomment-617004754 I'm not going to spend more time on this idiotic topic than this. AL is a horrible programming language and in my future programming career I expect to spend less and less time each year using it.

What does your toolbox look like?
My father-in-law, may he rest in peace, could literally make anything with his hands. He was a carpenter by profession, but he could paint, do masonry, plaster, pave roads, you name it and he could do it, as long as he had the right tools, a good mindset and could watch someone do it for a while to pick up some tricks. As programmers we seem to be married to languages and frameworks, and I can only guess why this is the case. In the old world we came from, called "On Premises", it was hard to have multiple frameworks, operating systems and databases work side by side. THIS IS NO LONGER TRUE!!! WAKE THE F*CK UP!! We live in a new world called cloud, preferably the Microsoft Azure cloud, and in this new world frameworks, databases and programming languages co-exist side by side just fine. Not C/SIDE is your toolbox, but Azure is!

How am I migrating our 200GB+ database with 2000 custom objects to Business Central? BY USING AZURE!!!!! – Mark Brummel – Quote me on that.

For the last year or so I've been preparing "our" Business Central SaaS migration, and the first thing I did was NOT look at AL code and extensions. The first thing I did was implement Azure Blob Storage. The second thing I implemented was Azure Functions, replacing C/AL code with C# code. The third thing I implemented was JavaScript add-ins to work around limitations of the web client. I did this together with the fantastic team at Global Mediator, which gave birth to a product called Meta UI, which, for those of you not too stubborn to "want to do it themselves", makes the web client a fantastic place to live in. Number four on my list was Logic Apps, to replace Job Queue processes scanning for new files and to enhance our EDI. Right now we are implementing Cosmos DB, with Logic Apps and a custom API, to reduce our database size and improve the scalability of our Power BI.

FIVE PROJECTS to move to Business Central SaaS WITHOUT a single line of AL code written, and we started our project about 18 months ago. The plan is to move to Business Central SaaS within the next 24 months with as few AL customisations as possible. You know what is funny? The things we are moving OUT of Business Central are the things that make us agile. These are the things we always have to make ad-hoc changes to, which is why we love C/SIDE so much. Please implement a new EDI interface. Boom, done, with Logic Apps and an Azure Function. Please change this KPI. Boom, done with Power BI. Please make this change to the UI. Boom, done with Meta UI. Oh, and of course, not to forget my friends in Denmark: please change the layout of this report. Boom, done with ForNAV! My frustration is probably not gone, and it won't be gone as long as I read people on the internet still treating AL as if it were C/AL, WHICH IT IS NOT!
Fortunately I have a fantastic new job at QBS which allows me to evangelise thinking out of the box and to help people get started with Azure. Only last week, in a few hours, I got a partner up and running with an Azure tenant running Business Central on a scalable infrastructure for performance tests.

Blog Post: Dynamics 365 Business Central: obsoleting the WITH statement

Have you ever used the WITH clause in C/AL or in the AL language? There are lovers of this clause and there are developers who hate using it (I'm in this second category), but regardless of personal opinion, the WITH clause is widely used in code and also in Microsoft's Base Application. As announced at the Dynamics 365 Business Central Virtual Event, Microsoft is continuously evolving the platform and their apps (today and in the future) and they need to be able to upgrade tenants smoothly while trying not to break our extensions. The WITH clause can help you write more readable code in certain situations (I don't think so, but that was the goal), but it creates a lot of inefficiency related to code compilation. To avoid this, Microsoft is planning to make the WITH clause obsolete, and you should start forgetting it (please don't use it anymore!) and start fixing your apps accordingly.

In AL (and in C/AL too) there are two types of WITH usage: explicit WITH and implicit WITH. An explicit WITH is something like the example sketched below: here I have a codeunit with a ProcessCustomer method that uses WITH to reference the Customer record. The procedure calls a local method called IsReady to determine whether the Customer record can be inserted into the database. What happens now if another extension (or Microsoft itself) adds a new method called IsReady to the Customer table? The problem here is related to symbol lookup. To resolve symbols, the compiler checks the following scopes, in this order:
"With" scope: Customer and Customer extension members, Record class members
Method scope: parameters and variables
Object scope: global variables and methods, fields, base class members
Global scope: enums, built-in methods and classes
In this way, your code can have an unpredictable result (the IsReady() method declared in the table extension wins and breaks your code, because you can get totally different results than what you want to achieve in your local IsReady() method). And if, for example, the IsReady() method declared in the table extension requires a parameter, your code will not compile anymore and your extension is totally broken (you now have a compilation error). This is the reason why Microsoft, in a future version of the AL language extension, will show a warning in your code if you're using the WITH clause, and fortunately you will have code actions for fixing these problems that, when clicked, transform your code accordingly.

While fixing an explicit WITH can be quite easy, the implicit WITH is more difficult to catch. Imagine I have a codeunit that works on the Customer table: here the implicit WITH can be fixed by prefixing Rec., and you can use AL code actions to do that. If I click on the code action for fixing the implicit WITH, the code is fixed by prepending Rec. to the methods, but the IsReady() method that I want to use here is not prefixed (because it's declared locally). And what if someone (a third-party extension or Microsoft itself) adds an IsReady() method to the Customer table in the future? To solve this problem, the new code action for resolving the implicit WITH also adds a pragma directive for the compiler (#pragma implicitwith disable). In this way the compiler will not resolve the IsReady() method by going through the previously explained chain, and it will always find the local declaration.
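The code screenshots from the original post are not reproduced here; the following is a minimal, illustrative sketch of the explicit WITH scenario just described and its qualified rewrite (object IDs are assumptions, and the WITH statement is left in deliberately to show the problem):

    codeunit 50160 "Customer Processing"
    {
        // Before: explicit WITH. IsReady() is resolved through the "with" scope first,
        // so the IsReady() added by the table extension below silently wins over the
        // local helper and changes what this code does.
        procedure ProcessCustomer(var Customer: Record Customer)
        begin
            with Customer do
                if IsReady() then
                    Insert(true);
        end;

        // After: WITH removed and every member access qualified, so the local helper
        // can never be hijacked by a table extension.
        procedure ProcessCustomerFixed(var Customer: Record Customer)
        begin
            if IsReady() then
                Customer.Insert(true);
        end;

        local procedure IsReady(): Boolean
        begin
            exit(true); // illustrative readiness check
        end;
    }

    // The kind of table extension that creates the conflict described above:
    tableextension 50160 "Customer Ext" extends Customer
    {
        procedure IsReady(): Boolean
        begin
            exit(Rec.Name <> '');
        end;
    }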
The implicit WITH must also be fixed on pages, where you have field references: all fields on a page must be changed to Rec.YourField, and you will have code actions for fixing this quickly. You can also disable the Microsoft warnings related to explicit (AL0606) or implicit (AL0604) WITH usage by using the following directive in your code (.al file): #pragma warning disable AL0604, AL0606. And you can also disable these warnings globally by adding this declaration to your app.json file: "suppressWarnings": [ "AL0604", "AL0606" ] – but personally this is not what I would recommend you do.

Common questions that you may have about this:
Will I receive a warning starting tomorrow about WITH usage? No, this is planned for the 2020 release wave 2 (about 5 months from now).
Should I start fixing my apps now? I think the answer is yes: avoid WITH from today, and start fixing the implicit WITH from today if you can.
When will WITH usage be turned from a warning into an error? At least one year later, so 2021 release wave 2 at the earliest.
Please don't wait too long; start fixing your apps by avoiding WITH usage starting today.
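And a similarly hedged sketch of the page-side fix (the object ID is an assumption): every field expression is qualified with Rec., which is exactly what the code action produces:

    page 50161 "Customer Quick List"
    {
        PageType = List;
        ApplicationArea = All;
        UsageCategory = Lists;
        SourceTable = Customer;

        layout
        {
            area(Content)
            {
                repeater(Group)
                {
                    // Implicit WITH style would have been: field("No."; "No.") { }
                    field("No."; Rec."No.") { ApplicationArea = All; }
                    field(Name; Rec.Name) { ApplicationArea = All; }
                }
            }
        }
    }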