Installing an extra language module in Business Central up to version 14 is fairly easy. First you download the new module from https://www.microsoft.com/en-us/download/details.aspx?id=100124 and then you install it in the database via the Development Environment and on the different clients, as described here: https://docs.microsoft.com/en-us/dynamics-nav/how-to--install-language-modules. So, that's easy.

This week a partner asked me how to install a new language in Business Central from 2019 Release Wave 2 (version 15) onwards, and that is actually a lot easier. The answer is to make an extension with the desired language layer and then install that extension. So, let's try that by installing a Swedish language layer in my own Danish database.

I start by downloading the Swedish installation package. Then I unpack the installation file and the Base Application.Source.zip file, and I copy the Base Application.sv-SE.xlf file to my desktop.

Now I create a new AL project in Visual Studio Code. In the app.json file, I add an extra property, features, and set the values TranslationFile and GenerateCaptions (see the sketch at the end of this post). Then I build the solution with Ctrl+Shift+B. This adds the Translations folder to the project. Right now, the only values in the xlf file are the captions from the HelloWorld.al file. So I delete the HelloWorld.al file, copy the Swedish xlf file from my desktop to the Translations folder, and add some information to the app.json file.

I publish the extension to the server and wait for the Modern client to start up. Then I can check Extension Management to see that the extension is installed, and to test it I can switch to the Swedish language. And now the Modern client speaks Swedish. To uninstall, just go to Extension Management and uninstall the extension again.

Instead of downloading the whole package from PartnerSource, it might be faster to spin up a Docker image with the Swedish version and copy the xlf file from there. I have put the project on GitHub: https://github.com/peikba/SwedishLanguageModule_15.4
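For reference, a minimal sketch of the app.json fragment described above (the app name, publisher and version are placeholders; only the features property matters here):

    {
      "name": "SwedishLanguageModule",
      "publisher": "My Publisher",
      "version": "1.0.0.0",
      "features": [ "TranslationFile", "GenerateCaptions" ]
    }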
Blog Post: How to add language layers in Business Central version 15
Blog Post: Live sharing your AL code directly in the browser
In this terrible period where COVID-19 is changing our days (and our lives) and where working from home is becoming a must, I think it could be helpful for some of you to know that Microsoft has opened the preview of a nice Visual Studio Live Share feature. I think that many of you know what Visual Studio Live Share is; if not, please read this post I've written two years ago.

With Visual Studio Live Share, when you're working on your project in Visual Studio Code (and it's the same for Visual Studio) you can start a sharing session with a contact. Your invited contact will receive a link for joining your shared code session. When they open the link, they have the choice to open the session directly in Visual Studio Code (or Visual Studio), as explained in my old post. But starting from today, you also have a new option: you can now open the Live Share session directly from a browser, without the need to have Visual Studio Code (or Visual Studio) installed on your machine. In this way you can join the Live Share session from everywhere and from every device.

When you join the session, you can see the shared code and start collaborating, all from your favourite browser. The shared session can be read-only or with full collaborative permissions (the invited people can modify your code directly). Remember that this feature is still in preview today (I've sometimes found problems with authentication during a fully collaborative session), but it's a nice feature that could help many of you in this period.
Forum Post: Created Invoice and auto Posted Invoiced
Currently, I am importing the records into a table, then creating invoices and posting them automatically. I have successfully created the invoices but cannot post them automatically. Please give me a solution. Thanks so much.

TempImport Invoice - OnAfterGetRecord()
    vCount := "TempImport Invoice".COUNT;
    IF "Created Invoice" = FALSE THEN BEGIN
      IF (LastDocument <> "Document No.") THEN BEGIN
        //Insert Purchase Invoice
        PurchHeader.INIT;
        PurchHeader."Document Type" := PurchHeader."Document Type"::Invoice;
        DocNo := lCdu_NoSeries.GetNextNo('P-INV+',TODAY,TRUE);
        PurchHeader."No." := DocNo;
        PurchHeader.INSERT(TRUE);
        PurchHeader.VALIDATE("Buy-from Vendor No.","Account No.");
        PurchHeader.VALIDATE("Order Date", WORKDATE);
        PurchHeader.VALIDATE("Responsibility Center","Responsibility Center");
        PurchHeader."Vendor Invoice No." := "TempImport Invoice"."Document No.";
        PurchHeader.MODIFY;
        LineNo := 0;
      END;
      LastDocument := "TempImport Invoice"."Document No.";

      PurchLineRec.RESET;
      PurchLineRec.SETCURRENTKEY("Document No.","Line No.");
      PurchLineRec.SETRANGE("Document Type",PurchHeader."Document Type");
      PurchLineRec.SETRANGE("Document No.",PurchHeader."No.");
      IF PurchLineRec.FINDLAST THEN
        LineNo := PurchLineRec."Line No." + 10000
      ELSE
        LineNo += 10000;

      PurchLine.INIT; //Create Purchase Line
      PurchLine.VALIDATE("Document Type", PurchLine."Document Type"::Invoice);
      PurchLine.VALIDATE("Document No.", PurchHeader."No.");
      PurchLine.VALIDATE("Line No.", LineNo);
      PurchLine.VALIDATE(Type,PurchLine.Type::"G/L Account");
      PurchLine.VALIDATE("No.","G/L Account"); // -> Check G/L
      PurchLine.VALIDATE(Quantity,1);
      PurchLine.VALIDATE("Direct Unit Cost","Amount");
      PurchLine.VALIDATE("Dimension Set ID","Dimension Set ID");
      PurchLine.INSERT(TRUE);
      PurchLine."Dimension Set ID" := "Dimension Set ID";
      PurchLine.MODIFY;
    END;
    //Process PPI Document
    // ReleasePurchDoc.PerformManualRelease(PurchHeader);
    // PostPurchaseInvoice(PurchHeader."No.");

TempImport Invoice - OnPostDataItem()
    PurchaseHeader.SETRANGE("Vendor Invoice No.","History Dragon Payment"."Document No.");
    IF PurchaseHeader.FIND('-') THEN BEGIN
      REPEAT
        ReleasePurchDoc.PerformManualRelease(PurchHeader);
        PostPurchaseInvoice(PurchHeader."No.");
      UNTIL PurchaseHeader.NEXT = 0;
    END;

//Proc posted
PostPurchaseInvoice(DocNo : Code[20]) : Boolean
    WITH PurchHeaderRec DO BEGIN
      PurchHeaderRec.SETRANGE("No.",DocNo);
      PurchHeaderRec.SETFILTER("Document Type",'%1',PurchHeaderRec."Document Type"::Invoice);
      IF PurchHeaderRec.FIND('-') THEN BEGIN
        // ReleasePurchDoc.PerformManualRelease(PurchHeaderRec);
        PurchPost.RUN(PurchHeaderRec);
      END;
    END;
Forum Post: RE: Created Invoice and auto Posted Invoiced
Please give us a bit more to work with, like: what error do you get, and how far does it get when you run it with the debugger turned on?
Comment on Test Fixture Initializer
Hi Luc, Was just working on getting all the Microsoft tests running for our project, when I remembered you did this post a few weeks ago (feels like it's much longer - it was before Covid19). But thanks, will see how I can incorporate this into our test setup.
Forum Post: RE: Created Invoice and auto Posted Invoiced
Thank you for the reply. For example, in the import table there are two documents. After creating the two invoices (A and B), the program only posts the invoice that was created last.
Blog Post: New Command in My CRS AL Language Extension: Search Object Names
Recently, I came across this post by Jack Mallender. An interesting idea on how to efficiently find AL objects among your files. It basically comes down to using regex in combination with the global search functionality in VSCode (yep, I'm stealing this from Jack's post – sorry, Jack ;-)). It immediately convinced me that this would be very useful for everyone, so I was thinking – why not make it part of "waldo's CRS AL Language Extension"? It didn't seem too difficult for the experienced TypeScript developer – so for a noob like me, it should be do-able as well ;-). A few hours later – after a lot of googling – I found the 9 lines of code that made this easily possible .. I'm not joking ;-).

So – I present to you – a new command as part of the extension: Search Object Names. Simply call the command, provide the search string, and look at the result in the search window. I also made it so that when you are on a word, or have selected a word in the active editor, it takes that word as the default search string. Just imagine you'd like to go to the source of a variable you're on: simply make sure your cursor is on the word and invoke the command.

Settings
Maybe it's a bit overdone, but yes, there is a setting as well (CRS.SearchObjectNamesRegexPattern), because you might want to search differently than I do. We were discussing that on Twitter, and I just decided to not decide for you on how you want to search, but let you set it up if you would like to search differently than me. Let me give you a few options on what would be interesting settings…

Find the source object (default)
Pattern: '^\w+ (\d* )?"*'
Setting in VSCode: "CRS.SearchObjectNamesRegexPattern": "^\\w+ (\\d* )?\"*" // Mind the escape characters
This is the default pattern, which means you don't have to set anything up for this behaviour. Basically this pattern will match any occurrence that starts with a word, then optionally a number, and then your search string – in other words: the exact source object of the object name you're searching for.

Find all references
Pattern: '\w+ (\d* )?"*'
Setting in VSCode: "CRS.SearchObjectNamesRegexPattern": "\\w+ (\\d* )?\"*" // I just removed the "^" from the default setting, which indicates "search anywhere"
This pattern will match any occurrence in code – which means: also the variable declarations. Let's say it's an alternative "where used". I won't set it up like this as a default setting, but I might just change it ad hoc in the search by simply removing that character.

Find anywhere in the name
Pattern: '^\w+ (\d* )?"*(\w+ *)*'
Setting in VSCode: "CRS.SearchObjectNamesRegexPattern": "^\\w+ (\\d* )?\"*(\\w+ *)*" // basically added that there could be multiple words before the search string
This pattern is somewhat more complicated, but if you would not rely on your search term being the beginning of the object name, but rather "somewhere" in the object name, you could use this one.

Enjoy!
Blog Post: A quick way to deploy your Azure Functions in the cloud
After my last webcast about Azure Functions, I received an interesting question: how can I quickly deploy a function to Azure? Or maybe to different Azure subscriptions or regions? We saw in the webcast how you can deploy an Azure Function by using Visual Studio or Visual Studio Code directly. Azure Functions also have a full range of continuous deployment and integration options provided by Azure App Service, and you can deploy an Azure Function directly from a CI/CD pipeline in Azure DevOps. But in my opinion there's also an even quicker way: Zip deployment. With Zip deployment, you create a .zip package of your function's files and then publish it directly to Azure by using the Azure CLI, PowerShell or the zipdeploy REST API (available at the endpoint https://<app_name>.scm.azurewebsites.net/api/zipdeploy). In this post, I want to show you how you can use Zip deployment for deploying your Azure Functions quickly and on multiple subscriptions or regions.

For this demo, I've created a very simple HttpTrigger Azure Function with Visual Studio by using the standard template. When your function is ready to be deployed, right-click the project, select Publish and then select Folder as the publish target. In this way, Visual Studio creates a publish folder with all the content (files, binaries, DLLs and so on) that must be published to Azure in order to execute your function. Select all the files and folders inside the publish folder and create a .zip package. This will be the ZIP archive that you will deploy to Azure directly. The deployment itself can then be done with a small script based on the Azure CLI, which works as follows.

Step 1: it declares some variables, like the Azure region where to deploy the function, the Resource Group to use (or create), the Storage Account to create, the name of the Azure Function and the full path of the .zip archive to deploy:

    $location = "westeurope"
    $resourceGroupName = "demozipdeployrg"
    $storageAccountName = "demozipdeploysa"
    $functionName = "DemoZipDeploy"
    $sourceZipPath = "C:\SD\source\repos\DemoZipDeploy\bin\Release\netcoreapp2.1\publish\DemoZipDeploy.zip"

Step 2: it creates a resource group:

    az group create --name $resourceGroupName --location $location

Step 3: it creates a storage account:

    az storage account create --name $storageAccountName --location $location --resource-group $resourceGroupName --sku Standard_LRS

Step 4: it creates a new function app:

    az functionapp create --name $functionName --storage-account $storageAccountName --consumption-plan-location westeurope --resource-group $resourceGroupName --functions-version 2

Step 5: it publishes the function to Azure by using the ZIP package:

    az webapp deployment source config-zip -g $resourceGroupName -n $functionName --src $sourceZipPath

That's it! Your Azure Function is deployed and it's up and running in the time of a click…
Forum Post: RE: find Azure tenant ID from Business central code
Hello, check this out: https://docs.microsoft.com/en-us/dynamics365/business-central/dev-itpro/developer/methods-auto/database/database-tenantid-method

    ID := Database.TenantId()

Best regards, Thomas Barbut
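For completeness, a minimal AL sketch of how the call above could be used (the procedure and variable names are just an illustration, not from the original answer):

    procedure ShowTenantId()
    var
        TenantIdValue: Text;
    begin
        // Database.TenantId returns the ID of the tenant the current session is connected to
        TenantIdValue := Database.TenantId();
        Message(TenantIdValue);
    end;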
Call for Microsoft Dynamics NAV/BC Functional Consultants to learn how we can address AEC Industry Business Demands
Blog Post: Dynamics 365 Business Central SaaS: save a file to an SFTP server
In our recently released “Mastering Dynamics 365 Business Central” book, in the Azure Functions chapter I’ve provided a full example of how to upload and download a file to Azure Blob Storage from a SaaS environment (this was one of the top requests I’ve received in all my trainings this year). But many of you have also raised one more request: in a Dynamics 365 Business Central SaaS environment, how can I save a file to an SFTP server? This is an operation that you cannot do directly from a SaaS tenant, simply because from there you don’t have access to local resources and you cannot execute custom code. In this blog post I want to give you a possible solution that involves using a C# Azure Function.

The Azure Function that we’ll use for this task is an HttpTrigger with a function called UploadFile, defined as follows:

    [FunctionName("UploadFile")]
    public static async Task<IActionResult> Upload(
        [HttpTrigger(AuthorizationLevel.Function, "post", Route = null)] HttpRequest req,
        ILogger log)
    {
        log.LogInformation("C# HTTP trigger function processed a request.");

        string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
        dynamic data = JsonConvert.DeserializeObject(requestBody);
        string base64String = data.base64;
        string fileName = data.fileName;
        string fileType = data.fileType;
        string fileExt = data.fileExt;

        Uri uri = await UploadBlobAsync(base64String, fileName, fileType, fileExt);

        //Upload to SFTP
        fileName = await UploadFileToSFTP(uri, fileName);

        return fileName != null
            ? (ActionResult)new OkObjectResult($"File {fileName} stored. URI = {uri}")
            : new BadRequestObjectResult("Error on input parameter (object)");
    }

The skeleton of this function is very similar to the sample provided in my book. The function receives a POST request with a JSON object that contains the file data (Base64) to upload and other parameters like the name of the file, the file type and the file extension. Then this function uploads the file to an Azure Blob Storage container (here called d365bcfiles, but this could be a parameter) by calling the UploadBlobAsync method. This method is defined as follows:

    public static async Task<Uri> UploadBlobAsync(string base64String, string fileName, string fileType, string fileExtension)
    {
        string contentType = fileType;
        byte[] fileBytes = Convert.FromBase64String(base64String);

        CloudStorageAccount storageAccount = CloudStorageAccount.Parse(BLOBStorageConnectionString);
        CloudBlobClient client = storageAccount.CreateCloudBlobClient();
        CloudBlobContainer container = client.GetContainerReference("d365bcfiles");
        await container.CreateIfNotExistsAsync(
            BlobContainerPublicAccessType.Blob,
            new BlobRequestOptions(),
            new OperationContext());

        CloudBlockBlob blob = container.GetBlockBlobReference(fileName);
        blob.Properties.ContentType = contentType;
        using (Stream stream = new MemoryStream(fileBytes, 0, fileBytes.Length))
        {
            await blob.UploadFromStreamAsync(stream).ConfigureAwait(false);
        }
        return blob.Uri;
    }

Then, when the file is uploaded to Azure Blob Storage, the function calls the UploadFileToSFTP method, which is responsible for taking this file and uploading it to an SFTP server.
This method is defined as follows:

    private static async Task<string> UploadFileToSFTP(Uri uri, string sourceFileName)
    {
        string storageAccountContainer = "d365bcfiles";
        string storageConnectionString = BLOBStorageConnectionString;
        string sourceFileAbsolutePath = uri.ToString();

        //SFTP parameters (read them from configuration or Azure Key Vault)
        string sftpAddress = "YOUR FTP ADDRESS";
        int sftpPort = 22; //YOUR FTP PORT
        string sftpUsername = "YOUR FTP USERNAME";
        string sftpPassword = "YOUR FTP PASSWORD";
        string sftpPath = "YOUR FTP PATH";
        string targetFileName = sourceFileName;

        //Downloads the blob into a memory stream
        var memoryStream = new MemoryStream();
        CloudStorageAccount storageAccount;
        if (CloudStorageAccount.TryParse(storageConnectionString, out storageAccount))
        {
            CloudBlobClient cloudBlobClient = storageAccount.CreateCloudBlobClient();
            CloudBlobContainer cloudBlobContainer = cloudBlobClient.GetContainerReference(storageAccountContainer);
            CloudBlockBlob cloudBlockBlobToTransfer = cloudBlobContainer.GetBlockBlobReference(new CloudBlockBlob(uri).Name);
            await cloudBlockBlobToTransfer.DownloadToStreamAsync(memoryStream);
        }

        var methods = new List<AuthenticationMethod>();
        methods.Add(new PasswordAuthenticationMethod(sftpUsername, sftpPassword));

        //Connects to the SFTP server and uploads the file
        Renci.SshNet.ConnectionInfo con = new Renci.SshNet.ConnectionInfo(sftpAddress, sftpPort, sftpUsername,
            new PasswordAuthenticationMethod(sftpUsername, sftpPassword));
        using (var client = new SftpClient(con))
        {
            client.Connect();
            memoryStream.Position = 0; //rewind the stream before uploading it
            client.UploadFile(memoryStream, $"/{sftpPath}/{targetFileName}");
            client.Disconnect();
            return targetFileName;
        }
    }

For connecting to the SFTP server I’m using a free library (available as a NuGet package directly from Visual Studio) called SSH.NET. Calling this function from an AL extension is quite simple, and it’s exactly the same sample that you can find in my book. In this sample, I have a codeunit with a method called UploadFile that permits you to select a local file and upload it. Obviously, you can avoid the “local upload” piece of code and use the same method to pass a dynamically generated file (a report and so on). The AL code is as follows:

    procedure UploadFile()
    var
        fileMgt: Codeunit "File Management";
        httpClient: HttpClient;
        httpContent: HttpContent;
        jsonBody: Text;
        httpResponse: HttpResponseMessage;
        httpHeader: HttpHeaders;
        fileName: Text;
        fileExt: Text;
        InStr: InStream;
        base64Convert: Codeunit "Base64 Convert";
    begin
        UploadIntoStream('Select a file to upload', '', 'All files (*.*)|*.*', fileName, InStr);
        fileExt := fileMgt.GetExtension(fileName);
        jsonBody := '{"base64":"' + base64Convert.ToBase64(InStr) + '","fileName":"' + fileName + '.' + fileExt +
            '","fileType":"' + GetMimeType(fileName) + '","fileExt":"' + fileMgt.GetExtension(fileName) + '"}';
        httpContent.WriteFrom(jsonBody);
        httpContent.GetHeaders(httpHeader);
        httpHeader.Remove('Content-Type');
        httpHeader.Add('Content-Type', 'application/json');
        httpClient.Post(BaseUrlUploadFunction, httpContent, httpResponse);
        //Here we should read the response to retrieve the URI
        Message('File uploaded.');
    end;

Some notes about this implementation: the file is not removed from the Azure Blob Storage container (because I want this behaviour); if you want, you can remove it after the SFTP upload. If the file is saved into the Blob Storage but the SFTP upload fails for some reason, the file is not uploaded again (there is no retry logic); you need to restart the UploadFile action.
For this reason, you could consider creating a TimerTrigger Azure Function that checks the Azure Blob Storage container every N time units and, if it is not empty, uploads the files to the SFTP server by calling the UploadFileToSFTP method. Use Azure Key Vault to store all your credentials. Hope it helps many of you… happy coding!
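As a final note, the AL procedure above stops at the comment about reading the HTTP response. A minimal sketch of a helper that could do that, assuming you simply want the plain-text body (with the URI) that the Azure Function returns – the procedure name is mine, not from the original post:

    local procedure PostAndReadResponse(Url: Text; var Content: HttpContent): Text
    var
        Client: HttpClient;
        Response: HttpResponseMessage;
        ResponseText: Text;
    begin
        // post the JSON body and return the function's response text, e.g. "File <name> stored. URI = <uri>"
        if not Client.Post(Url, Content, Response) then
            Error('Could not call the Azure Function.');
        if not Response.IsSuccessStatusCode() then
            Error('Upload failed with HTTP status code %1.', Response.HttpStatusCode());
        Response.Content().ReadAs(ResponseText);
        exit(ResponseText);
    end;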
Blog Post: Business Central Spring 2019 Update (BC14) CU10 TearDown
Business Central 14 CU10 (Application version 14.11.41204) TearDown

This CU seems to be quite a small update. There are no new objects this time, and there are only 70 objects marked "NAVW114.11". Statistics below:

There are very few changes. Some new functions have been introduced in order to have better structured tests; for example, the Sales and Purchase Line now have a FindOrCreateRecordByNo function, which does some preliminary checks before using FindRecordManagement. A lot of changes also relate to the removal of the IsBlankNumber boolean check on a lot of document lines. IsBlankNumber used to be set if the "No." field was empty or the IsCommentLine boolean was set; now Microsoft relies on the IsCommentLine boolean only. E-mail validation has been changed to use a facade function to prevent errors in some circumstances.

Table changes
Table 740 VAT Report Header has two new fields: Additional Information (Code50) and Created Date-Time (DateTime). These are used for additional information when printing the report. The Power BI tables (6301, 6302 and 6307) have a new field, EmbedUrl (Text250), which is used as a cache when selecting reports.

Report changes
Report 1307 Standard Sales - Credit Memo can now also print the Work Description field. This is nice.

Codeunit changes
Developers should notice that if you use Codeunit 10 Type Helper, there has been a change in option field handling. If an option string is erroneously written as ",Option1,Option2" instead of the correct format " ,Option1,Option2" (with a blank first entry), the Type Helper codeunit no longer returns 0 when checked with the GetOptionNo function. Only the correct format with the blank first entry is now treated as value 0 (a small illustration follows at the end of this post). Codeunit 980 Payment Registration Mgt. now correctly handles credit memos as the Refund document type instead of Payment. This removes the need for our standard change for the document type. Developers should notice this and change their reports accordingly if they have created a report that uses this field.

Page changes
Interestingly, Microsoft has changed the Customer and Vendor list page filters so that SETFILTER is replaced with SETRANGE. Perhaps there are some performance advantages to this change; maybe worth looking at more carefully later:

    SETFILTER("Date Filter",'..%1',WORKDATE); -> SETRANGE("Date Filter",0D,WORKDATE);

//urpok
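To make the Type Helper change concrete, a small AL sketch (illustrative only; the comments reflect the behaviour described in the Codeunit changes section above):

    procedure CheckOptionString()
    var
        TypeHelper: Codeunit "Type Helper";
    begin
        // correct format: the option string starts with a blank entry, so an empty value maps to option no. 0
        Message(Format(TypeHelper.GetOptionNo('', ' ,Option1,Option2')));
        // missing blank first entry: after CU10 an empty value is no longer treated as option no. 0
        Message(Format(TypeHelper.GetOptionNo('', ',Option1,Option2')));
    end;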
Blog Post: Listen to my interview in the first podcast with Martin Karlowitsch from Netronic
Martin Karlowitsch from Netronic interviewed me for his first podcast last week. I know that Martin has planned many other interviews with interesting people. I can't wait to hear them as they come. You can hear mine here: https://blog.netronic.com/manufacturers-think-processes-not-features-when-you-start-with-an-erp
Blog Post: Dynamics 365 Business Central SaaS: save a file to an SFTP server (the Logic App way)
Yesterday I provided a solution for saving a file generated directly from a Dynamics 365 Business Central SaaS tenant to an SFTP server by using Azure Functions. I have to admit that this is my preferred way, because it gives me more freedom, scalability and adaptability. But obviously, that's not the only possible way to do it. Today, I want to describe another possible solution, absolutely reliable and requiring fewer coding skills: using Azure Logic Apps.

Azure Logic Apps make it possible to execute custom workflows in the cloud without much programming skill. You can create your workflows directly via a graphical front end embedded in the Azure Portal, and while you can see an Azure Function as code triggered by an event, a Logic App is instead a workflow triggered by an event. From a Logic App you can use connectors to connect to a wide range of applications and cloud services. Logic Apps are the "big brother" of Power Automate (Power Automate is built on Logic Apps itself, is hosted in Office 365 instead of Azure, and is less scalable than Logic Apps).

How can we solve the "Dynamics 365 Business Central to SFTP" task by using Azure Logic Apps then? The first part of the solution is described in yesterday's post: you need to create an Azure Function for uploading a file from Dynamics 365 Business Central to an Azure Blob Storage container. With that in place, you can create a Logic App via the Azure Portal that uses the SFTP – SSH connector to download the uploaded file and transfer it to the SFTP server. This Logic App can be triggered automatically when a blob is added or modified in the Azure Blob Storage container.

To start creating the Logic App, go to the Azure Portal and create a new Logic App (here called SFTPBlobTransfer). Then, in the Logic App Designer, select Blank Logic App. You can now create a trigger for your workflow: select When a blob is added or modified (properties only) and then select the Azure Blob Storage container to monitor (here called d365bcfiles). Then you add an action to get the blob content from the path in the container, and then a new action to create a file by using the SFTP – SSH connector: select the Create file action. As a first step you need to provide the SFTP details for the connection, and then you can set up the Create file action by using the provided dynamic properties for the file name and the file content.

That's all. Save the project and your Logic App is active and ready to be triggered. To test it, just upload or modify a file inside the container and your workflow will be triggered. You can monitor a running workflow directly in the portal. Easier than yesterday's solution, isn't it? As a side note, Azure Logic Apps can also be created and deployed directly from Visual Studio.
Forum Post: The remote certificate is invalid according to the validation procedure
Hi everybody, I am calling a web service from a 3rd party when the error message in the title appears: "The remote certificate is invalid according to the validation procedure". Can you tell me whether the error comes from the 3rd-party web service or from NAV? And please give me a solution to fix this problem. I use NAV 2013. Thanks so much.
Forum Post: RE: The remote certificate is invalid according to the validation procedure
It looks like it is a 3rd-party error.
Blog Post: Microsoft Dynamics 365 Business Central 2020 Wave 1 is (almost) released!
Indeed – I haven't seen any official statement yet – but it's obvious: v16 is the current latest MS release .. and if you don't believe me – just check Docker (the latest "current" release is already v16 – docker image "mcr.microsoft.com/businesscentral/onprem") … I'm not going to bore you with what is already online – I will simply point you to the resources I could find today:

Dynamics 365: plan for 2020 release wave 1
What's new and planned for Dynamics 365 Business Central
Business Central on Docs
Business Central on Learn
The download link for Wave 1 (still under construction when writing this post)

And – of course – the post I wrote about it earlier this year also contains some information and links ;-).

Expect work to be done! At this point, I'm moving our 18 apps to v16, and want to comply with the bunch of extra code rules and new concepts Microsoft has foreseen, as described in this post … . I promise you – a LOT of work :(. But more about that in a next blog post ..
Blog Post: Utilize the down-time to enhance your skills
During these critical times, everybody is sitting at home and activity is often on the back burner. Maybe it is time to use that down-time to upgrade your knowledge about the Power Platform, Azure, security, the BC application or maybe even AL development. Therefore, Readynez have launched a number of one-day courses to hone your skills within:

Dynamics 365 & Power Platform with Julian Sharpe
Dynamics 365 Business Central with me
Cloud & Security with Jens Gilges
Security with Kevin Henry

This is a super initiative by Readynez, and it will help you pass the time in these special times. So, follow the link to see which courses are available. If you have any other suggestions, then please let me know.
Blog Post: How to REALLY rename all AL-files of your Business Central app..
You remember this post? I tried to warn you that when v16 comes out, there will be a new code rule that checks your filenames – and you'll have to (if you don't disable it) comply with Microsoft's file name convention. If you don't automate your file naming, then you're in for some .. uhm .. challenges. I just made sure that the automation of the filenames complied with Microsoft's rules .. .

I need to correct my story in that post though. I had been working on this "RenameWithGit" setting, which didn't work with multiroot workspaces and had some other stability problems. Only after my post – thanks to a reaction from James Pearson on Twitter – did I learn there is a much simpler way to do this.

First of all … forget about the "RenameWithGit" setting
Indeed – just forget I ever built it. I'm actually thinking of taking it away in the near future. I already removed it from all my workspaces, and I strongly recommend you do the same. It doesn't work like it's supposed to work .. and I'm embarrassed enough about it ;-).

There is only one word you need to remember .. Staging!
All you have to do after you have renamed all objects is stage the changes. That's it. And then you'll see that actually everything is just fine.. . I found some kind of description of this ability here: https://stackoverflow.com/questions/29706086/how-to-stage-a-rename-without-subsequent-edits-in-git. If you rename a file, you will get a delete and a new untracked file in Git. When you stage the file in VSCode, you get the rename. What staging actually does is compare the files, and when more than 50% is the same, it will indicate it as a "rename" instead of a deleted old file and a newly created file. That's smart! And yes, indeed .. I have been immensely wasting my time on the "RenameWithGit" setting :(.

Can I make sure everyone always stages before commit?
Well .. It's actually good practice to always "intentionally" stage. You must have seen this message already: in VSCode, it's called "smartcommit". But honestly, in my opinion, the smartest commit is an intentional commit. I don't like this message, and I switch it off by setting this up in my user settings:

    "git.suggestSmartCommit": false

I'm not forcing you to do so .. but this way, you can easily check in VSCode whether the rename of the file was actually a rename of the file – and not a deleted and a new file. Like it was intended.

So, what is now the safest workflow to rename my files?
Quite the same as I mentioned in my previous post about this – but a bit different.

1. Create a new branch
Yep, I still recommend doing the entire process in a separate branch. Of course. It will give you a way out if you messed up ;-).

2. Change the setup
The setup is actually very similar to the one in my previous post, only now with "RenameWithGit" = false. To match the file name convention of Microsoft .. this is what I would use:

    "CRS.FileNamePattern": " . .al",
    "CRS.FileNamePatternExtensions": " . .al",
    "CRS.FileNamePatternPageCustomizations": " . .al",
    "CRS.OnSaveAlFileAction": "Rename",
    "CRS.RenameWithGit": false,
    "CRS.ObjectNameSuffix": " WLD",

Alternatively, you could add the "RemovePrefixFromFilename" or "RemoveSuffixFromFilename" settings – but make sure you set up the mandatoryAffixes setting in the AppSourceCop.json as well, for the CodeCop to accept the removal of the prefix or suffix.

3. Commit
This commit is there because you might want to revert after a failed rename attempt.

4. Rename all
This is the same "Rename All" function I talked about in my previous post. It will rename all files of the current active workspace (the active workspace is the workspace of the currently activated document (file)) – not all workspaces. So you probably have to do this for all workspaces separately. With "RenameWithGit" set to false, I expect no mistakes. I was able to apply this to over 6000 files today .. so it's tested, I guess ;-).

5. Stage
This is THE step I was talking about earlier. Here you can check whether the rename was successful – all renamed files should indicate an "R". When you see that – you're good to go and …

6. Commit
Just commit, push and create a pull request to the necessary branch .. and you should be done!

Wait .. so I can do this in a multiroot workspace as well?
Yes indeed – this flow does work in a multiroot workspace. Do execute it for all workspaces separately though, like mentioned before. That's how I implemented it.. .

Conclusion
It's a piece of cake. Really. So just do it, comply with the naming convention, and don't feel like you're cheating on Microsoft ;-).
Blog Post: Remember that Microsoft changed the PartnerSource, Readiness and Learning platforms March 2020
Remember that Microsoft retired several platforms in March 2020:

- PartnerSource
- Dynamics Learning Portal

Besides that, there are changes to the Readiness site: Operational Readiness is moving to partner.microsoft.com. The content for the retired sites has been moved to other areas:

Content Area: New Destination
Home: Microsoft partner website page
Readiness & Training: Microsoft Learn
Partner Essentials: Microsoft partner website page
Sales & Marketing: Microsoft partner website resources page
Pricing & Licensing: PartnerSource Business Center (PSBC)
Deployment: CustomerSource
Support: Partner support page

The PartnerSource Business Center remains and has been extended with:

- Price Sheets
- Licensing Policies
- Promotions
- Content Pricing and Ordering
- News
- Agreements
- Partner Only Downloads

This way you don't have to google every time you need some Microsoft stuff. Read more about it here: https://mbs.microsoft.com/partnersource/global/news-events/news/PartnerSource_Retirement