NAV TechDays has been over for a while now .. and yes, I already blogged about it. But I recently had to refer to a part of my session on "Development Methodologies", and I noticed that someone named "Marcus Nordlund" actually put quite some time into completely "menutizing" the video in the comment section :-). An awesome effort that I needed to share! Thanks, Marcus! Here it is:
Future 3:08
Development methodologies 3:52
Don'tNet 4:04
Embracing Dependencies 4:43
Strict Guidelines 5:26
No Test, no approval 6:01
Extendable patterns 7:13
Avoid Breaking changes 8:23
When it's hard in AL, it doesn't belong in AL 9:02
Embracing Dependencies when moving to AL 9:27
Moving to AL: Migrate to Code Customized AL 10:03
Migrate to a Monolith extension 12:18
Rebuild monolith from scratch 12:34
Rebuild a set of apps with dependencies 12:53
Dependencies 13:16
Dependencies – Why? 13:52
Dependencies – How? 15:04
Dependencies – Schema example 15:16
Code Example of schema 15:59
Dependencies – Real schema example 19:27
Please Think things through 20:13
How do I think things through when moving to AL 20:53
Dependency Analysis based on C/AL 21:07
waldo.Model.Tools 22:13
Flow 25:15
Get All objects from C/AL & Tag all 25:34
Manually correcting/Ignoring modules 30:00
Get Where Used per object 30:14
Analyze dependencies per module 32:33
What is a circular dependency? 32:27
Solve circular dependencies 35:57
Change design 37:15
Manually create App-layer 39:27
Analyze dependencies for app 39:48
Solve Circular dependencies again 40:27
Tools we used are available for you 41:13
Dependency-stuff a bit further 41:41
How to architect? 45:54
To be able to set up demo environments 49:17
How it looks in the app 50:29
How it looks in code 52:09
Rest 57:02
Don'tNet 58:25
Some other more hands-on development methodologies 59:09
Where do we put business logic 59:18
Method codeunit 59:58
Decouple method from another extension 1:02:59
One App, One Repository 1:06:00
Test Driven Development 1:08:11
Workspace with different repos 1:09:02
Integration tests 1:10:14
How to create integration test 1:11:25
Automated Testing in Microsoft Dynamics 365 BC 1:13:22
Breaking Changes 1:13:56
Number Series 1:15:12
Translations 1:16:48
Summary Development methodologies 1:18:08
Questions 1:18:20
Move code from tables to codeunit 1:18:55
Test Codeunit – test XMLports 1:19:40
Tips for connection between BC and Azure DevOps 1:20:38
Where do you do the builds of apps? 1:22:26
Multiple apps, extend a table will be many separate tables 1:23:29
Upgrading test-driven development when converting to apps 1:25:02
When has it been too hard to do it in AL 1:25:44
Here is the complete video:
Sidenote: from the evaluations, this session was voted "Best Session" and me "Best Speaker" of the conference – something I'm really proud of, given the awesome content and speakers every single year :-).
Blog Post: NAV TechDays 2019 – Development Methodologies for the future – The Menu
Forum Post: Navision Tables Vs SQL 2008 R2 Tables
Hello, I dropped table A in the SQL database. When I now attempt to delete table A from the NAV Object Designer, I get the errors below. How do I force-delete the table from Navision? It is within my licence range.
Forum Post: RE: Navision Tables Vs SQL 2008 R2 Tables
You should never delete a Navision table directly in SQL. Always delete objects from within Navision; they will automatically be deleted from the SQL Server database as well.
Blog Post: Still wondering, what's the hype over extensions for Business Central?
Then join me at the two-day course at SuperUsers in Hillerød. This training leads individuals through a simulated implementation project, where the goal is to customize Dynamics NAV 2018/Business Central to meet a customer's requirements. At the course we will create a full solution for Dynamics NAV 2018/Business Central, creating new objects and extending existing objects. The solution will be created purely as an extension in Visual Studio Code, without changing a single object in the standard application of Dynamics NAV 2018/Business Central.
Prerequisites:
- You must know the Dynamics NAV Integrated Development Environment
- You must master programming in the C/AL language
- You must understand the concept of programming with events
- You must understand the concept of extensions
Read more about the course here: https://www.superusers.dk/kursus/dy0440/
Have you downloaded your free previews of the books:
- Introduction to the Modern Business Central client
- 200 Questions (The most important things to think about when considering Business Central)
Get them here:
Forum Post: LS+BC wave 1 VS LS+BC Wave 2
Dear all, greetings of the day. My company is working on upgrading our Dynamics LS NAV 2016 to Business Central, but we are a little bit worried about which wave we should take for the upgrade (LS-BC 14 Wave 1 or LS-BC 15 Wave 2).
LS-BC Wave 1: C/AL + AL; RTC + Web client; development environment; SQL 16
LS-BC Wave 2: AL only; Web client only; no development environment; SQL 16
These are the main differences I know about between the two waves, and I still don't know which wave we should select. We are in the hospitality business, and we are using LS and NAV. Any help will be much appreciated. Thanks and Regards
Forum Post: RE: LS+BC wave 1 VS LS+BC Wave 2
Based on the upgrade video from TechDays (on the mibuso channel): you first need to do the upgrade from your "old" version of NAV to BC 14.0, and then do the upgrade to BC 15.0. About the LS extension – you need to get consultancy from LS (because only they know what is ready for their functionality).
Forum Post: RE: Date Filtering On Report
Set the debugger on the TaxHistory.SETFILTER("Posting Date",'%1..%2',... line and check what data is in StartDate and EndDate, and why the filters do not work. That is the only way to get a real answer without guessing.
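A minimal sketch of that kind of check, in the post's C/AL style (the 0D guard and the MESSAGE line are illustrative additions, not from the original report):

IF (StartDate = 0D) OR (EndDate = 0D) THEN
  ERROR('StartDate or EndDate is not initialized.');
TaxHistory.SETFILTER("Posting Date", '%1..%2', StartDate, EndDate);
// Inspect the filter string that was actually applied:
MESSAGE('Posting Date filter: %1', TaxHistory.GETFILTER("Posting Date"));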
Blog Post: Dynamics 365 Business Central Wave 1 2020: what I love and what I want
As you already know, last week Microsoft published the 2020 release wave 1 plans for Dynamics 365 and Microsoft Power Platform. This is the document that announces the roadmap for the next major releases of these products. Regarding Microsoft Dynamics 365 Business Central, there are lots of interesting new features announced (or planned), and the things that I think will be absolutely wonderful are:
- API for continuous delivery of AppSource apps via Azure DevOps services: this will permit you to perform a phased rollout of an app to your customers directly from an Azure DevOps release pipeline. This feature will not be available immediately for all partners.
- Improved set of migration tools: you will find improved features for data migration between the on-premise and online versions, and from the v14 on-premise release to the v16 SaaS release.
- Printing management and barcodes: one of the biggest gaps today is direct printing on SaaS. The team is actively working on this, and in the next major release you'll find new events and features for handling cloud printing. I think that more should be done on this aspect.
- AL interfaces and many other language-related features: Waldo and Tobias have explained it all well. Improving the AL language is a must, and personally I hope to see ever more object-oriented features. For the on-premise world we'll also have a way to modify the app.json file (publisher and so on) of the base application without breaking the dependencies of standard apps.
- Telemetry in Application Insights: this is a feature that I love (I've talked about it here) and I think it will become ever more important in the future to help partners manage SaaS tenants. As a first step we'll see details about long-running queries and sign-in issues, but I hope for big improvements in the near future in order to have full telemetry (errors, warnings, custom telemetry injections and so on).
- Read scale-out: this is a new feature that has not been publicized much, but I think it's extremely interesting. Business Central artifacts (Reports, API Pages, and Queries) can now get access to a read-only replica of the database. The Page, Report and Query objects have a new property called "DataAccessIntent" that can take the values ReadOnly or ReadWrite. This property works as a hint for the server, which will connect to the secondary replica if possible. When a workload is executed against the replica, insert/delete/modify operations are not possible, so a new validation is introduced for ReadOnly objects: any of these operations will throw an exception at runtime (compile-time validation will be added in the future). See the sketch at the end of this post.
- Integration with CDS: this will be a first step in integrating Dynamics 365 Business Central with CDS (don't expect to have your ERP running on CDS), but it's a fundamental building block for integrating Dynamics 365 Business Central with the full Dynamics 365 product suite. Please push this feature more and more!
There are also lots of other features (currently listed only as "ideas") that I think should have a higher priority in the list of "things to do" for the near future, like:
- On-demand backup and restore of a tenant database (click here for support).
- Create a production environment from another production environment (click here for support).
- Production support for Docker containers (click here for support).
- Report extension (the possibility to extend a report's dataset without duplicating the entire object) (click here for support).
- Extending keys of standard tables and creating composite keys on standard tables (click here for support).
- Supporting the debugger experience on a production tenant: I know that we cannot directly attach a debugging session to a production tenant (especially if the cloud app is spanned across multiple servers), but it could be interesting to have the possibility to download a debugging session file (like the .diagsession file available for native Azure web apps inside Application Insights) for fully debugging offline.
- Improving Page Background Tasks to support write transactions: I would like the possibility to start a background task also for modifying records or launching long-running tasks. Read-only transactions alone are not enough.
- Improving the CRM integration.
I'm sure that Microsoft is listening…
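Since read scale-out is new, here is a minimal sketch of the DataAccessIntent property on a query object (the object name and columns are illustrative, not taken from the release plan):

query 50100 "Customer Overview"
{
    QueryType = Normal;
    // Hint to the server: run this workload against the read-only replica if possible.
    // Insert/modify/delete in the same workload would then throw at runtime.
    DataAccessIntent = ReadOnly;

    elements
    {
        dataitem(Customer; Customer)
        {
            column(CustomerNo; "No.") { }
            column(Name; Name) { }
        }
    }
}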
Forum Post: 365 BC On-Prem - Save RDLC report as HTML
Hi Guys, I have a client who wants their employees' payslips presented to the employees online, in their employee portal. So my task is to deliver an HTML file to their web guys. SAVEASHTML is only valid for reports with a Word layout, so this is not an option. I tried to work around it with SAVEAS, to an OutStream created from a file on the target share:

FileVar.CREATE(SaveFileName);
FileVar.CREATEOUTSTREAM(OutStr);
RecRef.GETTABLE(Employee);
PaySlipNewModel.SAVEAS('', REPORTFORMAT::Html, OutStr, RecRef);
FileVar.CLOSE;

However, this causes the client to crash, and on the service tier I get an error in the event log saying the value for parameter Format is out of range. So this was not a way to work around it either. Do I really have to recreate the layout in Word, or does someone have a solution for how I can get the output of an RDLC report saved as HTML? TIA // Alexander
Blog Post: Dynamics 365 Business Central and data compression
Yesterday Kennie Pontoppidan (from the Dynamics 365 Business Central team) asked us an interesting question: do you use data compression on your Dynamics 365 Business Central on-premise database? If yes, what type of compression? From the answers I can see that there are partners that use data compression widely, partners that use data compression only on certain tables, and partners that do not use data compression at all. Who is the winner? It's quite difficult to say, I think…
What is data compression? SQL data compression is a technology that has been around since SQL Server 2008 Enterprise Edition. Starting from SQL Server 2016 SP1, data compression is available in ALL editions. There are two possible types of data compression in SQL Server:
- Row-level compression: it works by converting any fixed-length data types into variable-length types (for example, a char(150) field that doesn't use all 150 characters is compressed to its real length).
- Page-level compression: this is a more advanced level of compression. By default, page compression first applies row-level compression and then adds two other types of compression (prefix compression and dictionary compression on columns).
As of Business Central April 2019, the use of SQL Server data compression is a natively supported configuration. You can use data compression to help reduce the size of selected tables in your Dynamics 365 Business Central database. In addition to saving space, data compression can help improve the performance of I/O-intensive workloads, because the data is stored in fewer pages and queries have to read fewer pages from disk. This is especially useful if your storage system is based on standard disks and not SSD disks. In the AL language, when you define a table you can use the new CompressionType property to define data compression (see the sketch at the end of this post).
That said, what's my opinion? I normally suggest to always use data compression on your Dynamics 365 Business Central on-premise database. My favourite rule (absolutely a personal rule) for Dynamics 365 Business Central is to use page-level compression on the ledger tables, the posted sales and purchase documents, and the archive tables. Normally, I suggest using the sp_estimate_data_compression_savings stored procedure to decide which type of data compression to apply to a table (this advice comes from a suggestion given to me by Jörg Stryk years ago): if the difference between row-level and page-level compression is under 5%, use row-level compression, otherwise use page-level compression. As an example, this is the query I use for choosing a data compression strategy for a heavy table:

EXEC sp_estimate_data_compression_savings
     @schema_name = 'dbo'
    ,@object_name = 'Cronus$Item Ledger Entry'
    ,@index_id = 1
    ,@partition_number = NULL
    ,@data_compression = 'PAGE'
GO

The above query estimates the savings of applying page-level compression to the Item Ledger Entry table. This is what happens on my database if I test page-level and row-level compression on this table: with page-level compression, the table is reduced from 283,360 KB to 54,152 KB; with row-level compression, it is reduced from 283,360 KB to 123,600 KB. As you can see, for this table page-level compression is clearly the better choice. If you want to save space and improve performance on big tables, I suggest using data compression on your heavy tables.
But what are the connected "problems" with using data compression? You always need to remember that when you use SQL data compression, the data is also stored in RAM in a compressed state. When the data is retrieved by the client it is uncompressed, and this can cause an increased load on your CPU. If you use data compression, you need to be sure that your SQL Server CPU has the power to satisfy your needs (please check that your CPU is never above 80%). But normally you don't under-estimate your CPU power, do you?
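As a reference, here is a minimal sketch of the CompressionType property in an AL table definition (the table name and fields are illustrative, not from the post):

table 50100 "My Archive Entry"
{
    // Possible values: Unspecified, None, Row, Page
    CompressionType = Page;

    fields
    {
        field(1; "Entry No."; Integer) { }
        field(2; Description; Text[100]) { }
    }
    keys
    {
        key(PK; "Entry No.") { Clustered = true; }
    }
}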
Forum Post: TableRelation issue
Hi Group, I've created a table + page extension where one of the fields should have a table relation to another table I've created. My table is called "Incident Error Type", and it has:
- 1 field of type Code[20] as its code
- 1 field of type Text[50] for its description
- 4 fields of type Boolean that allow the user to define what type of record it is; the 4 possibilities are "Internal Error", "Customer Error", "Vendor Error" and "Employee Error"
In the table extension, I have added 1 field called Error Type, for which I've defined the following TableRelation for testing purposes:

TableRelation = "Incident Error Type"."Error Type Code" where ("Vendor Error" = filter (= true));

I would ultimately like to have an if-else clause in which the where clause changes depending on the value of another field, but I cannot get even this one to work properly. Even though I have the where clause set to filter on Vendor Error being true, the lookup screen displays all the records. But if I select a value that is NOT a Vendor Error, it returns an error saying that the value cannot be found in the related table. Any help would be appreciated. Thanks
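For reference, the conditional TableRelation the poster describes looks roughly like this sketch (the "Incident Report" table and the "Error Source" option field are hypothetical stand-ins, not from the post):

tableextension 50110 "Incident Report Ext" extends "Incident Report"
{
    fields
    {
        // Hypothetical driver field that decides which filter applies
        field(50200; "Error Source"; Option)
        {
            OptionMembers = Internal,Customer,Vendor,Employee;
        }
        field(50201; "Error Type"; Code[20])
        {
            TableRelation = if ("Error Source" = const(Vendor)) "Incident Error Type"."Error Type Code" where("Vendor Error" = const(true))
            else if ("Error Source" = const(Customer)) "Incident Error Type"."Error Type Code" where("Customer Error" = const(true))
            else "Incident Error Type"."Error Type Code";
        }
    }
}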
Forum Post: How to Enable Page Inspector
I have an on-premise Business Central 15.2 (DK) installation. Whereas the Page Inspector works fine in our developers' containers and in our on-prem development environment, it doesn't show up in our production environment. Does anyone know of a setting or permission we may be missing here?
Forum Post: Hide the edit / read-only button
I think I'm doing this wrong, but here is my scenario/question… I created a new page where the employee should create an Incident Report. When the incident is first created, the status is set to true (OPEN). When the status gets changed to false (CLOSED), I want the page to become non-editable. Unfortunately, I have a conflict with the edit button at the top of the page. When the status is set to OPEN, I programmatically set each group's editable state to TRUE; when I change the status to CLOSED, I set each group's editable state to FALSE. These 2 scenarios work great while the page's editable state is TRUE, as all my fields toggle between editable and non-editable. But when the page is set to read-only, I cannot change it in code… the following does not seem to work in either the PageLoad or OnAfterGetRecord triggers:

CurrPage.Editable(Status);

Hope someone has a simple answer… thanks. I'm currently on BC13.
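For context, the pattern being described is roughly this sketch (a restatement of the poster's approach, not a confirmed fix; whether it takes effect once the user has put the page into read-only mode is exactly the open question):

trigger OnAfterGetCurrRecord()
begin
    // Status = true (OPEN) -> editable; Status = false (CLOSED) -> read-only
    CurrPage.Editable(Status);
end;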
Forum Post: Need a suggestion
Hi All, I got a requirement to flow the 'Vendor Item No.' onto the Warehouse Shipment and Receipt lines, and onto the posted warehouse tables as well. So I have two options: either a FlowField, or a real field populated through the posting routine. I tried both, and both work fine. But now I want to know which is the better way, and why? Working in Business Central cloud. Kindly suggest. Thanks in advance.
Forum Post: SQL Error message: when running a report from a role center
Hi Team, when running a report from a role center I get this error: "The following SQL error was unexpected: This SqlTransaction has completed; it is no longer usable." The connection from NAV is still open while the report is running. Thanks
Forum Post: RE: How to Enable Page Inspector
Well, I figured it out myself. We need to assign the D365 Troubleshoot permission set. Ref: https://docs.microsoft.com/en-us/dynamics365/business-central/across-inspect-page
Forum Post: RE: Need a suggestion
Hi, I would use a real field and have it flow through the posting functions. If you were using a FlowField and the vendor changed their item number, then instead of showing the correct (at the time of posting) item no. shipped, it would show you the new vendor item number. As a real field it would stay correct. But it really depends on the purpose of this requirement and the nature of the items.
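To illustrate the difference, here is a sketch of the two options on a table extension (the field numbers, lengths and the lookup formula are illustrative assumptions, not from the thread):

tableextension 50120 "Posted Whse. Shpt. Line Ext" extends "Posted Whse. Shipment Line"
{
    fields
    {
        // Option 1: a real field, filled once by the posting routine and frozen at that value.
        field(50100; "Vendor Item No."; Text[50]) { }

        // Option 2: a FlowField, recalculated on every read, so it always shows the CURRENT
        // value from the item card, not the value at posting time.
        field(50101; "Vendor Item No. (Lookup)"; Text[50])
        {
            FieldClass = FlowField;
            CalcFormula = lookup(Item."Vendor Item No." where("No." = field("Item No.")));
            Editable = false;
        }
    }
}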
Forum Post: RE: SQL Error message: when running a report from a role center
Hi, you have been a member for almost 4 years and have posted hundreds of questions already, and still you continue to ask questions like this! More information please, if you would like to get an answer.
Forum Post: RE: Non-inventory to inventory
It's the same issue you have if you need to change the costing method or start to use item tracking for an existing item. The only "trick" I know here would be to rename the "old" item and create the item again.
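A minimal sketch of that trick in AL (the item numbers are illustrative):

codeunit 50130 "Recreate Item Sketch"
{
    procedure RecreateItem()
    var
        Item: Record Item;
    begin
        // Rename the existing item so its original number becomes free again;
        // Rename also updates all related records that reference the item no.
        Item.Get('1000');
        Item.Rename('1000-OLD');
        // ...then create a fresh item card with the original number and the
        // new setup (item type, costing method, item tracking, and so on).
    end;
}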