Channel: Dynamics 365 Business Central/NAV User Group

Forum Post: calculate amount filtering off some records

Hello, I'm building a report on the 'Item Ledger Entry' table. It's no problem to use CALCFIELDS on the amount fields to show items by customer or vendor with sales or purchase amounts, using e.g. Vendor, 'Item Ledger Entry', and a temporary table with an Integer loop. However, the purchase amounts include item charges, and I need them without the charges. I'm trying to use the 'Value Entry' table, filtering it with ValueEntry.SETRANGE("Item Charge No.",''); and ValueEntry.SETRANGE("Item Ledger Entry No.",ItemLedgerEntry."Entry No."); and then CALCSUMS. If I run it in a separate codeunit, filtering the Item Ledger Entry on an item and a 'Posting Date', it works. But it doesn't work inside the Integer loop. The codeunit code below works:

recItemLedgerEntry.SETRANGE(recItemLedgerEntry."Item No.",'1000');
EVALUATE(PostingDate,'2019-01-24');
recItemLedgerEntry.SETRANGE(recItemLedgerEntry."Posting Date",PostingDate);
EntryNo := 0;
CLEAR(recValueEntry);
recValueEntry.SETRANGE("Item Charge No.",'');
IF recItemLedgerEntry.FINDSET THEN BEGIN
  REPEAT
    EntryNo := recItemLedgerEntry."Entry No.";
    recValueEntry.SETRANGE("Item Ledger Entry No.",EntryNo);
    MESSAGE(FORMAT(recItemLedgerEntry."Entry No."));
    IF recValueEntry.FINDSET THEN BEGIN
      recValueEntry.CALCSUMS("Purchase Amount (Actual)");
      PurchaseAmount2 += recValueEntry."Purchase Amount (Actual)";
    END;
  UNTIL recItemLedgerEntry.NEXT = 0;
END;

But when I try to insert the same logic into the Vendor / Item Ledger Entry / Integer report loop, it doesn't work. Thanks

Forum Post: RE: calculate amounts filtering off some records

My apologies; for some reason it was posted twice. Please delete the duplicate.

Blog Post: Importance of Testing

Overview
So, you’ve decided to switch to Microsoft Dynamics 365 Business Central and finished the hard work of development, migration, and training. Now you’re ready to go live, right? But I’ll bet there’s one thing you’ve unknowingly neglected: testing the new software. After the implementer tells you that they’re done with development and migration, the first thing you need to do is run testing. Testing might sound simple: run through a couple of scenarios and you’re good, right? I wish it were that simple, but often it isn’t. After the implementer hands you the keys to the software, testing is an investment you and your team need to take seriously. Doing that will set you up for future success. Testing is essential because it allows the users to:
Get familiar with the new system
Find potential problems
Discover unknown features
Overcome the fear of change
Establish confidence in your team

Make Sure Users Understand the Importance of Testing
It’s no secret: people are much more likely to take advice from people they trust. The first step to implementing an effective testing process is to build trust with the people you work with – and more specifically, the ones who are going to use the new system. Even though we always emphasize the importance of testing, most users won’t follow through with proper testing, even if they promised they would. Why? There are many reasons. Often, the users are not aware of how important testing is. More often, however, they are just busy dealing with daily challenges and crises at work. This is why building trust with the company is key, along with having a clear and easy process ready for them. You, as the implementer, should be the subject-matter expert, but you should also see yourself as a mentor to managers, helping them help their employees. Again and again, you need to explain that testing will benefit the user and their team. More importantly, emphasize that it is not a waste of their time, but rather an investment that pays long-term dividends. Time is one of the biggest reasons users avoid testing, but time is also one of the reasons testing is so important. One of the main benefits of testing is saving time: becoming familiar with the new system, solving problems early, and discovering all of the features will save users time in the future. It’s essential to explain that while testing requires time upfront, it will save them precious time later.

Work With The People in Charge
The first important part of implementing a successful testing operation is to know your place: you’re not the one in charge, but you’re there to make sure that those who are thrive. As a consultant, you’re not an employee but an advisor. The key to success is communication. You need to make sure that managers understand that you’re there to support their success, not to do everything for them. You need to set the stage for them to have all the necessary keys in their pockets, so that whenever a problem occurs, they know what to do. Since it is the manager’s responsibility to oversee the testing, your job is to make sure that the testing is done properly and, more importantly, that they don’t cut corners. You also need the managers to be aware of how important test runs are. Even if the management team of the company asks you to manage their people as if you were their boss, you still need to get the customer’s managers to take the lead.
To avoid getting a phone call every day with questions, you should have a plan that includes milestones for the users to complete – so the testing process is faster and easier because the users have clear directions to follow. A good, simple plan will also give the managers something to look for and help them communicate with their team. Again, explain the benefits of testing to the managers, and make it clear that in the long term the company is going to save time and money by making sure everything is running properly. Don’t forget: the best offense is a good defence. As with anything new, it will take time for everyone to grow comfortable with the new system. It won’t happen overnight, but if everything is done strategically and in an organized manner, the risk of encountering problems in the future is reduced and your clients will save precious time. Your clients will then also be more likely to recommend the system to other companies.

Make Sure The Process Is Easy
As an implementer, part of your job is to get the customer to test the system and, more importantly, to make the testing easy. You can’t, and shouldn’t, sit and look over their shoulder watching them run through different tests. That would be a waste of your time, because that’s not what you are being paid for, and you want the users to run tests in their natural environment. Users often forget that testing requires a lot of extra time and effort on top of their regular workload, which is why you need to ease the process for them as much as you can. Implementers can’t just mandate training; that will only make people want you to leave faster. Instead, provide them a clear step-by-step testing process that they can implement.

To Sum It Up
If implementers don’t do their job correctly, the testing won’t be successful. Proper testing will benefit the users for years to come and make the new system much more enjoyable to use. If users are running into error after error, or they aren’t able to find the features they need in the new system, it will cause frustration and make the system – or worse, you – look bad. When the implementer does his or her job correctly, the transition from implementation to testing and regular use should be smooth. The users will still need to build confidence in the system and may encounter some problems, but testing will help prevent many unnecessary struggles with the system.

Blog Post: How to upgrade AppSource/Marketplace Apps without uninstalling and installing again

With the latest Business Central Admin Center, we can check whether a newer version of an app is available in AppSource/Marketplace.
1. Go to the Business Central Admin Center.
2. Click on the environment you want to check for updates (e.g. Production).
3. Click on Manage Apps.
4. Here we can see whether a new version is available in the "Latest Available Version" field, along with an install option in the "Available Update Action" column.
5. In the example screenshot below, we can see that a new version 5.3.0.1 is available for Progressus Software, and we can install it by clicking "Install Update".
6. We get a confirmation notification when we click "Install Update".
7. The update will be scheduled once we click Yes.
8. The update will start.
9. And finally the app will be updated.

Forum Post: RE: calculate amounts filtering off some records

It's actually solved through creating a function.
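The reply doesn't show the function itself; a minimal sketch of what such a function could look like, using the same table and field names as the code in the question (the function name is illustrative, not from the original thread):

LOCAL PROCEDURE GetPurchAmountExclCharges(ItemLedgEntryNo : Integer) : Decimal
VAR
  ValueEntry : Record "Value Entry";
BEGIN
  // Sum the actual purchase amount of all value entries belonging to this
  // item ledger entry, excluding item charge entries.
  ValueEntry.SETRANGE("Item Ledger Entry No.",ItemLedgEntryNo);
  ValueEntry.SETRANGE("Item Charge No.",'');
  ValueEntry.CALCSUMS("Purchase Amount (Actual)");
  EXIT(ValueEntry."Purchase Amount (Actual)");
END;

Inside the Vendor / Item Ledger Entry / Integer loop it would then be called as PurchaseAmount2 += GetPurchAmountExclCharges(ItemLedgerEntry."Entry No.");, which also means the Value Entry filters are set fresh on every iteration instead of being inherited from the surrounding loop.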

Blog Post: Creating a Dynamics 365 Business Central Docker container with artifacts on Azure Container Instances

Azure Container Instances (ACI) is a great service offered by Azure that permits you to run serverless Docker containers in Azure with simplicity and speed, without having to provision or manage any underlying infrastructure. I think this service can be extremely useful when working with Dynamics 365 Business Central too, because you can spin up a container in a few minutes with all the power you need, use it as needed and then delete it, all with some simple scripts and without provisioning any virtual machines in the cloud or installing software. More than two years ago I wrote a post explaining how to use ACI to create a Dynamics 365 Business Central container. Microsoft has recently changed the way Docker containers for Dynamics 365 Business Central are created, and the reference for this topic is obviously Freddy's blog. Basically, to have a more manageable solution, Microsoft will stop producing Docker images and will instead publish builds as artifacts, which can be used together with the generic Docker image to run the version you want. To create a Docker container with Dynamics 365 Business Central we now need two things: the base image to use (according to the host's OS) and the artifacts to download. If you use the NavContainerHelper module, everything is simple and well explained on Freddy's site. For the base image name, NavContainerHelper has a function called Get-BestGenericImageName that gives you the best generic image to use for your OS. But how do you do that on Azure Container Instances? The first step is to find the base image name to use. On ACI, the supported OS versions are Windows Server 2016 and Windows Server 2019. You can get a list of all the available generic images with the following NavContainerHelper command:

$imagetags = Get-NavContainerImageTags -imageName "mcr.microsoft.com/dynamicsnav"

For example, Windows Server 2019, version 1809 is OS version 10.0.17763, and from the above list you can use the generic image called mcr.microsoft.com/dynamicsnav:10.0.17763.973-generic. The second step is to find the artifact URL for the container that we want to create. Here I want to create a Docker container on ACI with Dynamics 365 Business Central 16.2 IT, so I execute the following command:

$artifactUrl = Get-BCArtifactUrl -version 16.2 -country it -select Latest

and this gives me the following artifact URL: https://bcartifacts.azureedge.net/sandbox/16.2.13509.14256/it

We're now ready to create our container on ACI. The commands to use (after installing the Azure CLI) are the following:

$imageName = "mcr.microsoft.com/dynamicsnav:10.0.17763.973-generic"
$resourceGroup = "d365bcacirg"
$location = "westeurope"
$containerName = "d365bcaci"
$dnsName = "d365bcaci.westeurope.azurecontainer.io"
$artifactUrl = Get-BCArtifactUrl -version 16.2 -country it -select Latest

az group create --name $resourceGroup --location $location

az container create -g $resourceGroup -n $containerName --image $imageName --os-type Windows --cpu 2 --memory 16 --ip-address public -e artifactUrl=$artifactUrl ACCEPT_EULA=Y USESSL=N ClickOnce=Y publicDnsName=$dnsName --dns-name-label $containerName --ports 80 7046 7047 7048 7049 8080

where the artifactUrl parameter is passed as an environment variable.
Wait some minutes (usually at least 15-20) and your container is created on Azure Container Instances. Please remember that when the ACI instance is provisioned, the Dynamics 365 Business Central instance is not immediately available. If you inspect the container's log with the following command:

az container logs --resource-group $resourceGroup --name $containerName

you will see that the instance creation requires a bit of extra time (downloading and unpacking the application and platform artifacts, copying installation files, installing components and so on). When the instance is available, the container's log shows you the credentials and the public URL of your container, and you can start using it. When using ACI, remember that the availability of Azure Container Instances compute, memory, and storage resources differs per Azure region. Please always check the following link to see the limitations and to avoid errors on creation: https://docs.microsoft.com/en-us/azure/container-instances/container-instances-region-availability As an example, creating a Dynamics 365 Business Central container on ACI with 64 GB of RAM and Windows Server 2019 is not supported, and you can receive an error like: The requested resource with 'x' CPU and 'y.z' GB memory is not available in the location 'example region' at this moment. Please retry with a different resource request or in another location. When should you use ACI instead of a classic Docker installation (local or on an Azure VM)? When you quickly need to deploy a container that is available everywhere, without provisioning anything, and that has a short life (a few hours or days). I don't recommend ACI for long-running containers because it could cost you a lot. Last thing: I think you will never have this "problem", but in my real scenario I had an Azure Function that returned the artifact URL used for spinning up containers on ACI automatically when needed. At one point this function returned a BLANK artifact URL. In this case, the container was created correctly on ACI, but inspecting the log gave this result: You must share a DVD folder to C:\NAVDVD or a file system to C:\NAVFS in order to run the generic image. If you get this error, please check your artifact URL (thanks Freddy). Enjoy Dynamics 365 Business Central on ACI!

Blog Post: About ISV’s, Being Stubborn and Flexibility

Let’s start with a short story. The year is 2014 and the world was spinning as it did until March this year, with mass tourism and in-person events. With the release of NAV 2013 R2 and later NAV 2015, our community was just starting to embrace the three-tier concept and the Role Tailored Client. Nobody had heard of events or extensions. The economy was booming and everyone was busy, not worrying about the future. In that year I first did a small project for Datamasons to connect their EDI solution to Dynamics NAV using web services. Later on I did a similar project helping the folks at Dynamics TMS connect their solution to NAV using an architecture that was as decoupled as possible and easy to upgrade. When these ISVs asked me to publicly endorse their solutions, I told them that I would endorse the decoupled architecture and promote the idea of using best-of-breed solutions that interface with NAV, rather than doing everything in NAV and C/SIDE. This was not the first time I drew the wrath of an ISV in our ecosystem, but it was the first time it got quite big and ugly. It happened at the NAVUG Summit and it created some tension around the event for those involved. The reason for writing this blog now, reflecting on something that happened five years ago, is that this week several events happened that made me think about that NAVUG incident several times.

If Your Only Tool Is a Hammer Then Every Problem Looks Like a Nail
I would repeat myself too much if I started talking again about C/SIDE and our community’s habit of using it as a single tool to solve all problems. It’s in line with the ERP heritage from the late 1980s and 1990s, when interfacing was essentially non-existent, the internet had yet to be invented/adopted, and infrastructure was hard to maintain and share. The large ISV solutions that we have in our ecosystem were all born in the same era, and back in the day they were founded by young people (most often or always guys) in their garage, working long hours to establish their brand. Today most of them are in their 50s or early 60s, worrying more about their legacy than they do about the future. Back then I was just a bit too young to join that party, which leaves me in the middle, with no legacy to worry about and an open mind about the future.

It’s a Cloud-Connected World
Today we live in a connected world in which it has never been easier to open your application and share data and processes across platforms and geography. Microsoft did a fantastic job with Azure on the one side as the leading cloud platform for serverless applications, and Business Central and the Power Platform/CDS as cloud-ready frameworks to build system applications. With Business Central it has never been easier to design an open architecture that allows you as an ISV to keep your solution small and manageable, while allowing your partners to handle edge cases by subscribing to events or exchanging data using the API.

The Problem
For some reason, and I don’t really understand why, it looks like the larger ISVs are not open to using this opportunity. Many ISVs have monolithic applications that require “fob” files with thousands of objects to be inserted into your system. The reason for these monolithic applications is that they all try to solve every problem with the same software.
This is no longer necessary in the cloud world, where you can break your application into multiple smaller components to start with, but you can also leverage the Azure stack to move parts of your application to Power Platform/CDS or even Cosmos DB, Docker, microservice APIs, etc.

The Solution
Time. That’s the only answer I can think of. Given enough time we will see what happens and who wins. If I had to place a bet, I would avoid the majority of the horizontal solutions on AppSource that have a tight connection to AL. Instead I would bet on those that have a decoupled architecture and allow their software to be seamlessly connected to anything that understands the OData query language and HTML5.
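To make the decoupled approach described above concrete, here is a minimal sketch (the codeunit names, numbers and the event are invented for this example, not taken from any real ISV product) of how an ISV extension can expose an integration event that a partner extension subscribes to:

codeunit 50100 "ISV Shipping Mgt."
{
    procedure PostShipment(OrderNo: Code[20])
    begin
        // ... the ISV's core logic stays small and closed ...
        OnAfterPostShipment(OrderNo); // extension point for partners
    end;

    [IntegrationEvent(false, false)]
    local procedure OnAfterPostShipment(OrderNo: Code[20])
    begin
    end;
}

codeunit 50101 "Partner Edge Case Handler"
{
    [EventSubscriber(ObjectType::Codeunit, Codeunit::"ISV Shipping Mgt.", 'OnAfterPostShipment', '', false, false)]
    local procedure HandleOnAfterPostShipment(OrderNo: Code[20])
    begin
        // The partner-specific edge case is handled entirely outside the ISV code base.
        Message('Shipment %1 posted - partner logic runs here.', OrderNo);
    end;
}

The ISV app stays small and manageable, while the partner handles the edge case in its own extension, which is exactly the kind of architecture the post argues for.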

Blog Post: Test Automation Examples - Example Two and Categories

CODE COMPLETE! Yesss. This one was waiting somewhat too long on my desk to be completed, but this morning I finally did it: Extended Text on Assembly Documents. This example extends the standard Extended Text feature in Business Central to assembly documents. As with the previous two completed examples (Blocking Deletion of Warehouse Shipment Lines and Automatically Set Posting Period on GL Setup and User), you will find a full collection of ATDD scenarios in Excel and PowerShell files that were used to let the ATDD.TestScriptor PowerShell module efficiently create test codeunits.

Categories
Now, with code complete on all current examples, I have also started to add various categories to them to describe at an overview level what each example contains. This will allow you to find what you are looking for without needing to dive into the details first. These are the categories I have added so far:
ATDD Sheet: denotes that an ATDD Excel sheet is present in the test project
ATDD.TestScriptor (PowerShell): denotes that ATDD scenarios are present in PowerShell format and can be converted to AL test codeunits using the ATDD.TestScriptor PowerShell module
ConfirmHandler: denotes that some tests use a ConfirmHandler
Enqueue: denotes that variable enqueueing is used in some tests
Fresh Fixture: denotes that a fresh fixture is created in tests
Library: denotes that helper functions have been aggregated into test libraries
Report (ProcessingOnly): denotes that some tests make use of a processing-only report
RequestPageHandler: denotes that some tests use a RequestPageHandler
Scenario Variations: denotes that this example has a high number of variations of similar tests
Shared Fixture: denotes that a shared fixture is created in some tests
TestPage: denotes that some tests make use of a TestPage object
I have also added the following categories to point out the functional area the example feature belongs to: Assembly, Financial Management, Warehouse Management.

Notes
If you have any suggestion (or question) for improving existing examples and/or adding new ones, feel free to record an issue in the repository. Feel free to suggest categories, and where to apply them, there as well.
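As a rough illustration of what the ConfirmHandler and Enqueue categories refer to, here is a minimal sketch of a test codeunit (the object name, number and the confirm question are invented for this example; it assumes the standard test framework codeunits "Library - Variable Storage" and "Assert" are referenced):

codeunit 50130 "Confirm Handler Example Test"
{
    Subtype = Test;

    var
        LibraryVariableStorage: Codeunit "Library - Variable Storage";
        Assert: Codeunit Assert;

    [Test]
    [HandlerFunctions('ConfirmYesHandler')]
    procedure ActionRaisesExpectedConfirmation()
    begin
        // [GIVEN] the question the confirm dialog is expected to ask is enqueued for the handler
        LibraryVariableStorage.Enqueue('Do you want to delete the line?');

        // [WHEN] the code under test raises a confirm dialog
        // ... exercise the feature under test here ...

        // [THEN] the handler verified the question and answered Yes; nothing should be left in the queue
        LibraryVariableStorage.AssertEmpty();
    end;

    [ConfirmHandler]
    procedure ConfirmYesHandler(Question: Text[1024]; var Reply: Boolean)
    begin
        // Compare the raised question with the value enqueued by the test, then confirm.
        Assert.ExpectedMessage(LibraryVariableStorage.DequeueText(), Question);
        Reply := true;
    end;
}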

Forum Post: Default dimension value not updating in journal line when created through Configuration Package

Hi All, On the G/L account card I set the PROJECT dimension value TOYOTA as a default dimension with value posting SAME CODE. I used a configuration package to create a journal; in the configuration package I have not included the PROJECT dimension using the "Dimensions as Columns" feature, because since it is a default dimension value it shouldn't be necessary to include that project dimension column. In the configuration package, after importing from Excel, I applied the package. In the journal, the default dimension value TOYOTA does not come through. Please advise. Note: I also tested including the PROJECT dimension column in the configuration package using the "Dimensions as Columns" feature, and it still doesn't work. Thanks Vinoth
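One thing worth checking (this is only a hedged workaround sketch, not a confirmed fix, and the procedure name is invented): RapidStart can apply imported values without running the same validation as manual entry, so the default dimension logic on the account may never fire. Re-validating the account number on the imported lines forces that logic to run, though be aware that re-validation can also overwrite other imported field values such as the description.

procedure ApplyDefaultDimensions(TemplateName: Code[10]; BatchName: Code[10])
var
    GenJournalLine: Record "Gen. Journal Line";
begin
    GenJournalLine.SetRange("Journal Template Name", TemplateName);
    GenJournalLine.SetRange("Journal Batch Name", BatchName);
    if GenJournalLine.FindSet(true) then
        repeat
            // Re-running the OnValidate trigger of "Account No." rebuilds the
            // dimension set from the account's default dimensions.
            GenJournalLine.Validate("Account No.");
            GenJournalLine.Modify(true);
        until GenJournalLine.Next() = 0;
end;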

Blog Post: Using Azure Logic Apps for creating a Dynamics 365 Business Central container on Azure Container Instances

A week ago I wrote a post explaining how to create a Dynamics 365 Business Central Docker container on Azure Container Instances (ACI) using the new artifact approach. ACI gives you "containers as a service", so you can rapidly create and launch containerized applications in a serverless scenario. And when going totally serverless, a task that you may need to handle is spinning up a Docker container based on events that occur in the cloud or in other systems. As an example, imagine that when you insert a Customer record in your internal ERP system (obviously, it's Dynamics 365 Business Central) and you trigger an action called "Create test environment for this Customer", a new Dynamics 365 Business Central Docker container must be created on Azure Container Instances. This container must be created without deploying scripts or other things. How can you do that? The answer can be: use Azure Logic Apps! You can create an action inside D365BC that sends a POST request to the Logic App endpoint using the AL HttpClient object, passing a JSON body with the parameters that define the container you need to create. With Azure Logic Apps and the Azure Container Instance connector, you can set up automated tasks and workflows that deploy and manage container groups. The Container Instance connector supports the following actions:
Create or delete a container group
Get the properties of a container group
Get a list of container groups
Get the logs of a container instance
To handle this scenario, I will create an HTTP-triggered Azure Logic App. This Logic App receives a JSON object that contains the parameters defining the Dynamics 365 Business Central Docker container to create, like the following:

{
  "image": "mcr.microsoft.com/dynamicsnav:10.0.17763.973-generic",
  "artifactUrl": "https://bcartifacts.azureedge.net/sandbox/16.2.13509.14256/it",
  "resourceGroup": "d365bcacirg",
  "containerName": "d365bcdev",
  "containerGroup": "d365bcacigroup",
  "cpu": 2,
  "memory": 4,
  "dnsName": "d365bcdev.westeurope.azurecontainer.io"
}

When the HTTP request is received (with the above JSON body) on the Logic App endpoint, a new Create or update a container group action is triggered. This action receives the parameters from the body and must be configured so that it can create the Dynamics 365 Business Central container. The configuration is quite tricky, and here I want to explain the parameters that must be configured to successfully create a Dynamics 365 Business Central ACI container. As a first step, we need to configure the basic parameters (Azure subscription, resource group, location, container name, container image and resources). Then, the other required parameters that must be configured are the environment variables required for the creation of a Dynamics 365 Business Central container. Here I'm passing the artifactUrl variable (received from the JSON) because it's needed in order to create the container (download the artifacts), and also ACCEPT_EULA = Y and USESSL = N. We also need to add parameters like Container Port Number and Container Port Protocol in order to expose the needed container ports. Then you need to specify the container's OS type (Windows) and the DNS name, and you need to add the Port Number parameters (for port mapping). The last needed parameters are the Container Group IP address type and the Container Group SKU.
The container's creation on ACI requires a bit of time, so I've added an Until loop that waits until the Container Group state is equal to Succeeded. Save the Azure Logic App. What happens now? If you send a POST HTTP request to the Logic App endpoint, passing a JSON body like the one described above, you receive a response and the workflow defined in the Logic App is triggered. If the Logic App succeeds, your ACI instance is provisioned, and your Dynamics 365 Business Central Docker container spun up by the Azure Logic App is ready to go. Clean, low-code and fully serverless.
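The post mentions triggering the Logic App from an action inside Dynamics 365 Business Central via the AL HttpClient. A minimal sketch of such a call (the procedure name is illustrative, the endpoint URL is a placeholder for your Logic App's HTTP trigger URL, and the payload values are the example ones from the post):

procedure CreateTestEnvironment(CustomerNo: Code[20])
var
    Client: HttpClient;
    Content: HttpContent;
    Headers: HttpHeaders;
    Response: HttpResponseMessage;
    Body: JsonObject;
    BodyText: Text;
    LogicAppUrl: Text;
begin
    LogicAppUrl := 'https://<your-logic-app-http-trigger-url>'; // placeholder: paste the Logic App endpoint here

    // Build the JSON payload that describes the container to create.
    Body.Add('image', 'mcr.microsoft.com/dynamicsnav:10.0.17763.973-generic');
    Body.Add('artifactUrl', 'https://bcartifacts.azureedge.net/sandbox/16.2.13509.14256/it');
    Body.Add('resourceGroup', 'd365bcacirg');
    Body.Add('containerName', 'd365bc-' + LowerCase(Format(CustomerNo)));
    Body.Add('cpu', 2);
    Body.Add('memory', 4);
    Body.WriteTo(BodyText);

    Content.WriteFrom(BodyText);
    Content.GetHeaders(Headers);
    Headers.Remove('Content-Type');
    Headers.Add('Content-Type', 'application/json');

    // POST the request to the HTTP-triggered Logic App endpoint.
    if not Client.Post(LogicAppUrl, Content, Response) then
        Error('Could not reach the Logic App endpoint.');
    if not Response.IsSuccessStatusCode then
        Error('The Logic App returned HTTP status %1.', Response.HttpStatusCode);
end;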

Blog Post: Dynamics 365 Business Central 2020 Wave 2 release plan: my favourite features

Today Microsoft shared the Dynamics 365 Wave 2 release plan with the general public. These are my favourite new Dynamics 365 Business Central features that will be released in the Wave 2 2020 period (October 2020 to May 2021):
Improved overview and management of the available database and file capacity: tenant administrators will be able to get an overview of the database and file capacity (total size and currently used) to better control the size of their environments and purchase additional capacity in time for when it is needed. Customers will be able to go beyond the current 80 GB database limit by purchasing additional capacity.
Service-to-service authentication for Automation APIs: a new application permission scope is added, called Automation.ReadWrite.All. This allows service-to-service authentication, with external services connecting as an application without impersonating normal users. Using the OAuth client credentials flow, an app token with the Automation.ReadWrite.All scope can then be used to access Business Central.
Unlimited number of Production and Sandbox environments: a customer will be able to purchase additional production environments (more than 3). For each newly purchased production environment, they can also create additional sandbox environments (number to be decided).
Database access intent changed to read-only for frequently used reports: I've talked a lot about the read-only replica in a SaaS environment and how you can use it to improve performance. Now Microsoft will also change the default behaviour of many standard reports so that they use the read-only replica instead of the production database.
Introduction of a new Company Hub: customers often need access to multiple companies, either in the same tenant or in a different tenant environment. The Business Central Company Hub gives you a list of the companies you work in. You can easily add new companies by just providing a URL and a name for the company. The list of companies contains a few KPIs for each company, displayed for users if they have the needed access. You also have a list of assigned user tasks for a given company, and it's possible to run selected Excel reports for the company from the Company Hub too. The Accountant Hub will be discontinued.
Integration with Microsoft Teams: bring Business Central data into Microsoft Teams conversations to make decisions faster as a team.
Improving the Microsoft Power Platform connectors: support for header and line entities, filtering, search, CDS virtual entities, Power BI queries on the read-only replica and more.
Optimized loading for pages with FactBoxes: content on the hosting page displays first, followed by any visible FactBoxes in the order in which they are shown on the page. FactBoxes continue to run within the same session unless a developer has explicitly implemented a Page Background Task for a FactBox. If the FactBox pane is collapsed, no FactBoxes are run upon opening the page; instead, they are run on demand when the FactBox pane is expanded.
Data audit system fields added on every table: four new system fields (SystemLastModifiedOn, SystemLastModifiedBy, SystemCreatedBy, SystemCreatedOn) will be added to give developers an easy and performant way to program against historical data.
On-demand joining of companion tables: a rather hidden feature at the moment, for improving performance with data that comes from a base table plus extension tables.
Possibility to rename environments and to restore an environment (sandbox or production) to a selected point in time up to 30 days in the past.
Attach to user session when debugging in a sandbox: this feature enables attaching the debugger to an active user session in the sandbox.
Debug extension installation and upgrade code: with this feature you can set breakpoints in install or upgrade code, attach, and trigger publishing of an extension to debug install or upgrade code.
Developers can emit telemetry to Application Insights from AL code natively: simply use the new Session.LogMessage function to send your custom signals to Application Insights from your extension's code.
Extension publishers can get telemetry in Azure Application Insights: you can now add an instrumentation key for Azure Application Insights in the app.json file of an extension, and when certain events occur in your extension (long-running queries, report execution, extension updates, update errors, web service requests), they are automatically sent to Application Insights.

To learn more and see other plans for the Dynamics 365 product line, check this link.
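For the native telemetry feature mentioned above, a minimal sketch of what emitting a custom signal with Session.LogMessage could look like (the procedure name, event ID, message and custom dimension are invented for this example, and the exact overloads may vary by platform version):

procedure LogPostingStarted(DocumentNo: Code[20])
var
    CustomDimensions: Dictionary of [Text, Text];
begin
    CustomDimensions.Add('documentNo', DocumentNo);
    // Sends a trace signal to the Application Insights resource configured
    // for the extension (instrumentation key in app.json).
    Session.LogMessage('MYEXT0001', 'Posting routine started', Verbosity::Normal,
        DataClassification::SystemMetadata, TelemetryScope::ExtensionPublisher, CustomDimensions);
end;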

Blog Post: Installing a DevOps Agent (with Docker) with the most chance of success

You might have read my previous blog on DevOps build agents. Since then I've been quite busy with DevOps, and especially with ALOps. And I had to conclude that one big bottleneck keeps being the same: a decent (stable) installation of a DevOps build server that supports Docker with the images from Microsoft. Or in many cases: a decent build agent that supports Docker, not even having anything to do with the images from Microsoft. You have probably read about Microsoft's new approach to providing images: Microsoft is not going to provide you any images any more, but will help you create your own images, all with navcontainerhelper. The underlying reason is actually exactly the same: something needed to change to make "working with BC on Docker" more stable.

Back to Build Agents
In many support cases, I had to refer back to the one solution: "run your Docker images with Hyper-V isolation". While that solved the majority of the problems (anything regarding alc.exe (compile) and finsql.exe (import objects)), in some cases it wasn't solving anything, which leaves only one conclusion: it's your infrastructure, i.e. the version of Windows and/or how you installed everything. So that made me conclude that it might be interesting to share with you a workflow that, in some respects, doesn't make any sense, but does solve the majority of the unexplainable problems with using Docker on a build server for AL development :-).

Step 1 – Install Windows Server 2019
We have the best results with Windows Server 2019, as it's more stable and is able to use the smaller images for Docker.

Step 2 – Full Windows updates
Very important: don't combine Docker, Windows updates and such. First install ALL Windows updates and then reboot the server. Don't forget to reboot the server after installing ONLY the Windows updates.

Step 3 – Install the necessary Windows features
So, all Windows updates have been applied and you have restarted; time to add the components that are necessary for Docker. With this PowerShell script, you can do just that:

Install-WindowsFeature Hyper-V, Containers -Restart

You see, again, you need to restart after you do this! Very important!

Step 4 – Install Docker
You can also install Docker with a script:

Install-PackageProvider -Name NuGet -MinimumVersion 2.8.5.201 -Confirm:$false -Force
Install-Module DockerProvider -Confirm:$false -Force
Install-Package Docker -RequiredVersion 19.03.2 -ProviderName DockerProvider -Confirm:$false -Force

You see, we refer to a specific version of Docker. We noticed that not all versions of Docker are stable; this one is, and we always try to test a certain version (with the option to roll back) instead of just applying all new updates automatically. For a build agent, we just need a working Docker, not an up-to-date Docker ;-).

Step 5 – The funky part: remove the "Containers" feature
What? Are you serious? Well, yes. Now remove the Containers feature with this script and, very important, restart the server again!

Uninstall-WindowsFeature Containers
Restart-Computer -Force:$true -Confirm:$false

Step 6 – Re-install the "Containers" feature
With a very similar script:

Install-WindowsFeature Containers
Restart-Computer -Force:$true -Confirm:$false

I can't explain why these last two steps are necessary, but it seems the installation of Docker messes up something in the Containers feature that, in some cases, needs to be restored. Again, don't forget to restart your server!
Step 7 – Disable Windows Updates
As Windows updates can terribly mess up the stability of your build agent, I always advise disabling them. When we want to apply Windows updates, we simply execute the entire process described above again! Yes, indeed: again!

That's it! You may ask yourself: is all this still necessary now that we have moved to the new way of working with Docker, where we build our own images and such? Well, I don't know, but one thing I do know: the problems we had to solve were not all related to the Business Central images; some were just about "Docker" and the way Docker was talking to Windows (or so we assumed). So I guess it can't hurt to try to set up your build servers in a way that you know is just going to work right away. And that's all I tried to do here ;-).

Forum Post: .al file preview in Windows Explorer preview pane

Don’t like waiting for Visual Studio Code to boot up when you’re just looking for a specific .al file? With this trick, you can preview .al files in Windows 10 without opening them. To read more, visit: https://mrinmax.blogspot.com/2020/02/al-file-preview-in-windows-explorer.html

Blog Post: Prism for AL Preview

You might know that I have been (and still am) a fan of Statical Prism. But now that we have moved into the AL world, it is becoming obsolete, as it only handles C/AL code. Christin Clausen and his stati-cal team have been working hard on the AL version of the tool. Last week they released a preview version of Prism for AL to the community and asked me to review it. Unfortunately I have not yet found enough time to do that, but I had a quick look and found it very promising, and as such worthwhile to bring to the attention of the community, so that others might also review it and give the stati-cal team as much feedback as possible on what they have done so far. Note that it's a preview that does not yet cover everything, as the Known preview limitations list makes clear. Feel free to review it and help make it as worthwhile as Statical Prism.