Hi Eric, thank you very much for your reply. The EDI solution/module is Lanham. It's all empty, however. I'll contact them about where to start. Thanks
↧
Forum Post: RE: NAV EDI tables
↧
Blog Post: Dynamics 365 Business Central and unpublished extensions during an upgrade
These days I've received some messages from partners (and I've seen open questions on the forums too) reporting that, during the upgrade of a SaaS tenant to the 15.4 release, some custom extensions deployed on sandboxes are unpublished. Is this a problem or a bug in Microsoft's upgrade process for the SaaS tenant? NO! This is not a bug!

As I've said many times on this blog (just search for the word "sandbox" here), the sandbox environment is something different from the production environment. A sandbox environment is an environment created for developing, testing or training. Under the hood, it's not a real clone of the production environment. That said, as you know, on a production environment you can deploy per-tenant extensions only via the Extension Management page, while on a sandbox environment you can also deploy directly by using Visual Studio Code. And this is the cause of the unpublishing of extensions.

As I wrote in the past, when you publish an app to a sandbox environment from Visual Studio Code (a per-tenant extension or PTE), this app is published with a local scope (the scope of the service node that hosts the sandbox environment). An app published on a production environment is instead published with a global scope. What happens when Microsoft upgrades your sandbox from Dynamics 365 Business Central version X to version Y? This is not a common platform upgrade: the entire environment is moved to another node where the new version Y is running. Because of this, all your installed PTEs are unpublished, and you need to publish them again on the newly upgraded environment (obviously, you don't lose data if you keep the same extension ID). This is exactly what happened to many partners in the tickets I've received. So, be aware of this and act accordingly.
If your customer is actively working on a sandbox environment (I see many customers using the sandbox for testing features, creating financial simulations, and training employees), remember also that a sandbox environment does not have an automatic backup policy in place by Microsoft (it's not backed up with your production tenant). If you have sensitive customer data in a sandbox because your customer is doing simulations or training, you may want to evaluate whether it's better to create a secondary production environment instead of using a sandbox.
↧
Forum Post: Debugging BaseApp in sandbox
Hi everyone, hope you are all healthy and safe. I am testing a standard functionality in BC 15.4 on a sandbox, and I'm getting an error. I'm deploying an extension with F5, hoping that when the error appears, the execution will stop at least in a .dal file, but nothing happens. So my question is... is it possible to debug the BaseApp in a sandbox? Thank you all!!
↧
Forum Post: RE: Debugging BaseApp in sandbox
For the debugger to stop on an error, you need to set the 'breakOnError' property in your launch.json to 'true'.
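For reference, a minimal launch.json sketch with that property set (the name, environment and startup object values are placeholders; adjust them to your own sandbox):

```json
{
    "version": "0.2.0",
    "configurations": [
        {
            "type": "al",
            "request": "launch",
            "name": "My sandbox",
            "environmentType": "Sandbox",
            "environmentName": "sandbox",
            "startupObjectType": "Page",
            "startupObjectId": 22,
            "breakOnError": true
        }
    ]
}
```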
↧
Forum Post: RE: Debugging BaseApp in sandbox
Hello Daniel, first of all, thanks for your answer. The breakOnError property was already set to true, but nothing happens...
↧
Forum Post: RE: Debugging BaseApp in sandbox
Please have a look at this link: - https://community.dynamics.com/business/f/dynamics-365-business-central-forum/374917/is-debugging-in-the-standard-table-not-possible
↧
Forum Post: RE: Debugging BaseApp in sandbox
Thanks for your answer. So basically, there is no way to debug the BaseApp in a sandbox. The only trick is to create a function subscribed to the nearest event to the base functionality we want to debug, and put the breakpoint on it. Am I right??
↧
Forum Post: RE: Debugging BaseApp in sandbox
Standard BC v16 sandbox container, standard AL: Go! workspace (which includes a launch.json with the breakOnError property turned on); I hit F5 to deploy it to my container with the debugger. I created a new sales order and entered a silly value into the Customer number, and it breaks right where I expected, in the Customer.dal object file. I don't know what you are doing that you're not telling us, but it is supposed to break on error and open the object where the error happens. This is probably not helping you solve the issue, but at least it shows what is supposed to happen.
↧
Forum Post: RE: Debugging BaseApp in sandbox
Hello Daniel, I am using a 15.4 sandbox. I am trying to create a purchase order from the sales order. As you can see, I have the breakOnError property turned on, and the debugger is running...
↧
Forum Post: RE: Debugging BaseApp in sandbox
As far as I know, yep, that's right. Thanks,
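A minimal sketch of that event-subscriber trick (the codeunit ID, event and object names here are only illustrative; subscribe to whichever published event is closest to the base code you want to step into):

```al
codeunit 50100 "Debugging Entry Point"
{
    // Illustrative only: subscribe to a published event near the code you
    // want to inspect, set a breakpoint inside the subscriber, then step
    // into (F11) the base application from there.
    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Sales-Post", 'OnBeforePostSalesDoc', '', false, false)]
    local procedure OnBeforePostSalesDoc(var SalesHeader: Record "Sales Header")
    begin
        // Breakpoint here; execution stops just before Sales-Post runs.
    end;
}
```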
↧
Blog Post: Test Automation Examples - a GitHub repository
For almost a decade I have been evangelizing test automation by blogging, presenting at various conferences and webinars, giving workshops, and, eventually last year, writing a book. People are picking it up, but it's clear there is still "a war to be won". Testing, more specifically test automation, is too often considered a cost center instead of an essential part of our daily development practice. In my joint article with Global Mediator, called From a testing mindset to a Quality Assurance-based mindset, we shared a number of thoughts on this (and we owe you a number of follow-up posts on that). As research has shown, pushing defect finding upstream will lower the cost of your product/project and increase the satisfaction of your customers. The consequence is simple, rephrasing the above: testing is not a cost center, but an essential part of your daily development practice, just as app coding is.

Now let's see if ... a next step in Luc's gospel on test automation ... can help some more people cross the threshold. For this I have started a new GitHub repository, which I baptized Test Automation Examples. Follow the link to have a look there. The Test Automation Examples repository will become, over time, a collection of examples used in my test automation classes/workshops: small apps that have been built together with their counterpart test code. Easy enough to understand the app, relevant enough to get useful examples of test code. Each example will have a concise scope description accompanied by a flowchart. Each app will be defined by a coherent and complete set of ATDD scenarios in ATDD.TestScriptor PowerShell format, allowing them to be converted into code. I will do my best to provide both AL and C/AL app and test code. As part of my first Online Crash Course Test Automation, I have uploaded the first example, called Blocking Deletion of Warehouse Shipment Lines, to the repository.
Be welcome to make use of this example, and any next one, and even more: be invited to contribute by improvement suggestions or even proposing new examples. Let's challenge each other to the next level of development practice and get us all on test automation! BTW : this week the 2nd Online Crash Course will start. There are still seats available.
↧
Forum Post: Change Promoted Action Categories position of "Report"
Hi and happy Easter ;) Is it possible to change the position of the first three promoted action categories? Or are they defined by default and not changeable? https://docs.microsoft.com/en-us/dynamics365/business-central/dev-itpro/developer/devenv-promoted-actions Thank you!
↧
Comment on Test Automation Examples - a GitHub repository
Excited to see this grow and become a repository for everybody to learn and develop their testing skills. We must start looking at testing as Quality Assurance and not a drain on resources; NOT using automated testing is a constant drain on resources that is hidden in many other areas of our business like Support (first, second and third line).
↧
Blog Post: Azure SQL Serverless tier: a way to save cost for your workloads in the cloud
When going to the cloud, saving costs while maximizing performance is always a goal, but sometimes not so easy to achieve. I've talked a lot in the past about how to move your databases to the cloud, how to use Azure SQL Database for Dynamics 365 Business Central, how to optimize performance, and also (if cost is not a problem and you want maximum performance for your database in general and for Dynamics 365 Business Central on an IaaS architecture) how Azure SQL Hyperscale can have a lot of performance advantages over Azure SQL (but you need to handle scaling yourself).

Today I want to talk a bit about a fairly new service tier for Azure SQL databases: the serverless tier. Azure SQL Database serverless is a compute tier for single databases that automatically scales compute based on workload demand and bills for the amount of compute used per second. The serverless compute tier also automatically pauses databases during inactive periods, when only storage is billed, and automatically resumes databases when activity returns. Azure SQL Database serverless is not for all scenarios, but if you have a database that is not always heavily used and you have periods of complete inactivity, this is a very interesting solution that can guarantee you performance and help you save a lot of costs.

A typical scenario is the classic database usage that I think you have in a Dynamics 365 Business Central installation: the database is heavily used from Monday to Friday, maybe 8 hours a day, then it's not used at night and it's not used during the weekends. In this situation, you have 5 days * 8 hours/day of usage. And during the working days, I think you have some "ups and downs" of database usage. This is where Azure SQL serverless can help: it can automatically scale your compute up and down according to the database usage, and it can pause the database when it's not in use. You pay only for what you're effectively using.
Creating an Azure SQL serverless database is extremely simple, and it can be done via the Azure Portal or directly by using scripts (PowerShell or Azure CLI). To create an Azure SQL serverless database via the Azure Portal, you need to select the General Purpose model and then select the Serverless option.

What you'll pay for this instance depends on how much memory and how many processors your database uses, as well as how much storage the database requires. You can set the minimum and maximum number of cores (from 0.5 to 16) and the available memory (linked to vCores, from 3 GB minimum to 48 GB maximum). Storage is cheap, as usual: about $0.12 a month for every GB. Compute is just under $0.27 per vCore per hour. This means that if you have a 50 GB database that uses 2 cores for 8 hours a day, a month costs about $6 for the storage plus (0.27 * 2) * 8 hours * 20 working days = about $86 for compute, so roughly $92 in total. If your database uses 1 processor for an entire day, that's under $7 for the day. However, if the database is inactive you can pause it and pay nothing for the compute power.

As an example, here I'm creating an Azure SQL serverless database that can span from 0.5 to 6 vCores, with a maximum data storage capacity of 20 GB. For this database, I've chosen to enable the auto-pause feature. As said before, the database can be paused manually, programmatically, or automatically by enabling the auto-pause feature (as you can see in the previous picture). When auto-pause is enabled, Azure pauses the database if it detects no activity on the database for the number of hours specified; the minimum is 1 hour. The above picture also shows the estimated price. As you can see, you will pay about 2.52 euro/month for the storage and about 0.000122 euro per vCore per second for compute.
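Just to make that arithmetic concrete, here is a tiny sketch of the estimate (the unit prices are the approximate figures quoted above, not official Azure pricing):

```python
# Rough monthly cost estimate for an Azure SQL serverless database, using the
# approximate unit prices mentioned in this post (assumptions, not official):
#   storage ~ $0.12 per GB per month, compute ~ $0.27 per vCore per hour.
STORAGE_USD_PER_GB_MONTH = 0.12
COMPUTE_USD_PER_VCORE_HOUR = 0.27

def estimate_monthly_cost(db_size_gb, vcores, hours_per_day, working_days):
    storage = db_size_gb * STORAGE_USD_PER_GB_MONTH
    compute = vcores * COMPUTE_USD_PER_VCORE_HOUR * hours_per_day * working_days
    return storage, compute

# The 50 GB / 2 vCores / 8 hours a day / 20 working days example from the post:
storage, compute = estimate_monthly_cost(50, 2, 8, 20)
print(f"storage ${storage:.2f} + compute ${compute:.2f} = ${storage + compute:.2f}")
```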
As a comparison, if you select a Provisioned instance (where compute resources are pre-allocated and you're billed per hour based on the configured vCores) with 4 vCores and 20 GB of maximum data size, the cost is about 385 euro per month.

I've run a set of stress tests on this database using the SQLQueryStress tool (simulating some typical ERP queries), and performance and scalability are good. The first test simulates 50 users performing some select/insert queries (with lots of joins) for 100 iterations each. The second part of the test simulates 200 users performing the same set of queries (1000 iterations each).

As you can see from the above diagrams, serverless databases also expose the App CPU billed and App CPU percentage metrics. The App CPU percentage metric is, in my opinion, quite interesting to check because it gives you the percentage of the vCore allocation used (100% means that your database's vCore allocation is under-estimated), and you can tune your database scaling accordingly. If you see (as in my case) that 6 vCores are too many (you'll never reach this limit), you can change the vCore allocation for your serverless database. From the above image, you can also see that the App CPU billed metric shows the amount of compute billed (measured in vCore-seconds). In this case, executing this test used 2.69K vCore-seconds. The compute unit price (as you can see in the second image of this post) is 0.000122 euro, so the estimated billing is about 2.69 * 1000 * 0.000122 = 0.33 euro.

Obviously, this is not a complete test, but I think it gives you an idea that the serverless option can handle scalability and performance in a great way while maximizing cost savings. The database engine is able to scale dynamically (up and down) according to the database usage, and you always pay for how much the engine is actually working.
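The billed amount follows directly from the App CPU billed metric; a quick check of the numbers above (the unit price is the one shown in the portal estimate, taken here as an assumption):

```python
# Serverless compute is billed in vCore-seconds. Reproducing the
# stress-test example from this post:
vcore_seconds = 2.69e3        # "App CPU billed" reported for the test
unit_price_eur = 0.000122     # euro per vCore-second (portal estimate)

cost_eur = vcore_seconds * unit_price_eur
print(f"estimated compute bill: {cost_eur:.2f} EUR")
```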
Azure SQL serverless is a great choice in lots of scenarios, but there are some aspects that you should consider and that could affect your choice:
- When the database is paused, the resume is not immediate: you have to wait some seconds (in my personal experience, at least about 40 seconds). During this resume phase the database is not accessible.
- When your database is not used, the cache is cleaned, so you can see some small performance degradation after a resume.
- The compute unit price is higher for a serverless database than for a provisioned compute database (Azure SQL serverless is optimized for workloads with intermittent usage patterns). If CPU or memory usage is high for a long period of time, the provisioned compute tier may be less expensive.
- Obviously, if you disable the auto-pause feature, your database is always online and you pay something even when the database is not in use. When the database is paused, instead, only the storage is charged.

Remember also that auto-pause may not happen if:
- There are active sessions in your database
- The CPU usage for user workloads is > 0
- A long-term backup is in progress
- Active geo-replication is enabled

Is it a cloud SQL database solution for everyone? No. Is it the recommended solution for Dynamics 365 Business Central with Azure SQL as backend? No. But if you have a particular usage pattern with peaks of load (like working hours) and other moments where your database is not used (like at night), this database type can help you save money. When instead you have a more regular, predictable usage and higher average compute utilization over time, a provisioned instance can be better for you. Before deciding the best database type for your application, carefully evaluate your usage patterns. Remember also that you can switch the database from provisioned to serverless after creation.
As a comparison, here is a brief summary of the main differences between a provisioned and a serverless instance:
↧
Forum Post: RE: Change Promoted Action Categories position of "Report"
Answer to myself: no, it is not possible to change it. I made a workaround: I didn't use the Report category in the code. Instead, I created a new category "Report" as Category5 to use it in this position. The original Report category is not shown anymore, because I never use it on this page.
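For anyone hitting the same limitation, the workaround looks roughly like this (the page ID, source table and action are illustrative): the built-in Report category is simply left unused, and Category5 gets the caption "Report" instead.

```al
page 50101 "My Document List"
{
    PageType = List;
    SourceTable = Item;
    // Positions 1-3 (New, Process, Report) are fixed; here Category5 is
    // captioned "Report" so promoted report actions appear later instead.
    PromotedActionCategories = 'New,Process,Report,Category4,Report';

    layout
    {
        area(content)
        {
            repeater(Group)
            {
                field("No."; "No.") { ApplicationArea = All; }
            }
        }
    }

    actions
    {
        area(processing)
        {
            action(PrintDocument)
            {
                Caption = 'Print';
                ApplicationArea = All;
                Promoted = true;
                PromotedCategory = Category5; // not PromotedCategory = Report

                trigger OnAction()
                begin
                    Message('Run the report here.');
                end;
            }
        }
    }
}
```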
↧
Blog Post: Business Central Spring 2019 Update (BC14) CU11 TearDown
General BC 14 application version 14.12.41935 is a small update. The size of the changelog is 378 KB, which suggests there are not many changes included. There were no new objects, and only a couple of new fields. Most of the changes are visibility changes on different pages and fields, and reports have only minor modifications, which keeps the changelog quite small. Security fixes: the introduction for this CU points out that it fixes the following two security issues: https://portal.msrc.microsoft.com/en-us/security-guidance/advisory/CVE-2020-1018 https://portal.msrc.microsoft.com/en-us/security-guidance/advisory/CVE-2020-1022 The first one fixes the visibility of a masked field (for example, a password) in certain situations, and the second one addresses a vulnerability that could allow an authenticated attacker to run arbitrary code on the victim's server. This update also includes the security fix introduced in CU10, which has an identical description to CVE-2020-1022. Database conversion: the installation of the platform update does not require a database conversion. If you are upgrading an existing installation, please follow the instructions from MSDN. New objects: this time there were no new objects; the changes are mainly concentrated on different pages. New fields and other interesting functionality: a new boolean field "Use GLN in Electronic Document" has been added to table 18 Customer and table 79 Company Information. This is obviously used in the electronic document exchange framework, where it replaces the VAT Code in electronic document exchange. In codeunit 2, function InitElectronicFormats, support for Peppol versions 2.0 and 2.1 has been removed. The only remaining electronic formats are the SEPA and Peppol BIS3 formats. Peppol BIS3 support has been extended to also contain Sales Validation and Service Validation.
Document posting (for example, codeunit 80 Sales-Post and codeunit 5980 Service-Post) has been extended to check the default electronic document fields through codeunit 452 Report Distribution Management. This codeunit is worth reading through if you are into electronic document exchange. I am :) Microsoft has continued the "Clean the Clutter" initiative on different pages, and some variables like "DateAndTimeFieldVisible", "DimVisible1", "DimVisible2" and so on have been added, this time especially on Manufacturing pages. Thank you for reading, please leave a comment!
↧
Forum Post: RE: Debugging BaseApp in sandbox
https://www.google.com/amp/s/demiliani.com/2019/10/25/dynamics-365-business-central-debugging-the-base-application/amp/ I came across this today and I could debug base code. Thanks,
↧