With the latest Business Central Admin Center, we can check whether a newer version of an AppSource/Marketplace app is available.
1. Go to the Business Central Admin Center.
2. Click on the environment that you want to check for updates (for example, Production).
3. Click on Manage Apps.
4. Here we can see whether a new version is available in the "Latest Available Version" field, together with an install option in the "Available Update Action" column.
5. In the example screenshot below, we can see that a new version 5.3.0.1 is available for Progressus Software, and we can install it by clicking "Install Update".
6. We get a confirmation notification when we click "Install Update".
7. The update will be scheduled once we click Yes.
8. The update will start.
9. And finally the app will be updated.
↧
Blog Post: How to upgrade AppSource/Marketplace Apps without uninstalling and installing again
↧
Blog Post: Microsoft Dynamics 365: 2020 release wave 2 plan
That's right. It's time again for the next round of features that Microsoft is planning for the next major release. It's weird this time, lacking most info from conferences .. the kind of "silent" release of Wave 1 .. it's almost like flying blind. Although, there is a crapload of information online. And of course, don't forget Microsoft's Virtual Conference from June 3rd. Since I'm still focusing on Business Central, I'm only going to cover that part .. but do know that the entire "Dynamics 365" stack has a release for Wave 2. Business Central-related information can be found here: https://docs.microsoft.com/en-us/dynamics365-release-plan/2020wave2/smb/dynamics365-business-central/planned-features As it doesn't make sense to just name all features (they are all listed on the link above), I'm just going to talk again about the features I'm looking forward to (and why), and the ones I'm kind of less looking forward to.

What am I looking forward to?
As always, most probably this is going to be somewhat tech-focused .. sorry .. I am what I am, I guess.

Service-to-service authentication for Automation APIs
Very much looking forward to that, just because of the possibilities that we'll have with DevOps: at this point, supporting a decent release flow in DevOps to an environment that is fully "Multi Factor Authentication" is a challenge. For me, this has a very high priority.

Support for an unlimited number of production and sandbox environments
Today, a business can only be in three countries, because we can only create 3 production environments. That obviously doesn't make sense, so it's absolutely a good thing that Microsoft is opening this up! Next to that…

Business Central Company Hub extension
That sounds just perfect! It seems they are really taking into account that switching companies is not a "per tenant" kind of thing, but really should be seen across multiple tenants. It seems it's going to be built into the application, within a role center or a task page. At some point, Arend-Jan came up with the idea to put it in the title bar above Business Central like this: Really neat idea that I support 100% :-). As long as it would be across multiple tenants/localizations .. :-). Maybe as an extension on the Company Hub? Who knows.. . Any solution, I'm looking forward to! I couldn't find the extension in the insider builds, so nothing to show yet.. .

Business Central in Microsoft Teams
Now, doesn't THAT sound cool? Because of the COVID-19 situation, our company, like many other companies out there, has been using Teams a lot more than it was used to. And the more I set up Teams, the more I see that little integrations with Business Central could be really useful! What exactly they are envisioning here, I don't know, but the ability to enter timesheets, look up contact information to start a chat or call or invite or… . Yeah, there are a lot of integration scenarios that would be really interesting.. .

Common Data Service virtual entities
I'm not that much into the Power stuff (fluff?) just yet, but I can imagine that if I were able to expose my own customizations, or any not-out-of-the-box entities, to CDS, it would be possible to implement a lot more with Power Apps and other services that connect to the CDS entities.

Performance Regression and Application Benchmark tools
One of the things we are pursuing is the ability for DevOps to "notice" that things are getting slower. This means that we should be able to "benchmark" our solution somehow.
So I'm looking forward to diving into these tools to see if they can help us achieve that goal!

Pages with FactBoxes are more responsive
Role Centers open faster
These are a few changes in terms of client performance, and what's not to like about that ;-). I have been clicking through the client, and it definitely isn't slower ;-). I also read somewhere that caching of the design of the pages is done much smarter .. even across sessions, but I didn't seem to find anything that relates to that statement here on the list.

On-demand joining of companion tables
So, so important. Do you remember James Crowter's post on Table Extensions? Well, one of the problems is that it's always joining these companion tables. I truly believe this can have a major impact on performance if done well.

Restoring environments to a point in time in the past
I have been advocating strongly against "debug in live". Well, this is one step closer to debugging with live data, but not in the production environment. This is also a major step forward for anyone supporting Business Central SaaS!

Attach to user session when debugging in sandbox
Sandboxes are sometimes used as User Acceptance Test environments. In that case, multiple users are testing not-yet-released software, and finally we will be able to debug their sessions to see what they are hitting.

Debug extension installation and upgrade code
Finally! I have been doing a major redesign of our product, and would have really enjoyed this ability ;-). Nevertheless, I'm very glad it's finally coming! No idea how it will work, but probably very easy ;-).

What am I not looking forward to?
Well, this section is not really about the things I don't like, but rather the things I wasn't really looking forward to as a partner/customer/.. . I don't know if it makes any sense to make that into a separate section .. but then again .. why not. It actually all started with something that I really, really hated in one of the previous releases: the ability to go hybrid / customize the Base App. And I have kept the section ever since ;-). So ..
this is the rest of the list of features we can expect:

Administration
- Deprecation of the legacy Dynamics NAV Help Server component
- Improved overview and management of the available database and file capacity
- Database access intent changed to read-only for frequently used reports

Application
- Group VAT reporting
- Default unit cost for non-inventory items
- Track packages from more types of sales documents
- Bank reconciliation improvements
- Consolidation file format support for Dynamics 365 Finance
- Auto-resolve Common Data Service conflicts
- Notify users of high-risk changes in selected setup fields
- Use conversion templates to convert contacts to vendors and employees
- Use recurring journals to allocate balances by dimension values
- Use the Copy Journal function on general journals and G/L registers when reversing entries
- Use Word document layouts to customize outgoing customer documents
- Use contact Mobile Phone Number and Email consistently across application

Migrations to Business Central Online
- Continued enhancements for migrating from Dynamics GP to Business Central
- Historical data migration from Dynamics GP to Azure Data Lake
- Migrate from Business Central 14.x on-premises to Business Central 16.x online
- Migrate from Business Central 15.x on-premises to Business Central 16.x online

Modern Clients
- Improved accessibility for low-vision users
- Access multiple production or sandbox environments from the mobile apps
- Basic auth settings deprecated for Contact Sync and Outlook Add-in
- Changes to the action bar in dialogs
- Update the navigation experience terminology to improve usability
- Updates to page styling
- Page Inspector supports temporary tables

Seamless Service
- Data audit system fields are added to every table
- Log of admin operations in the Business Central admin center
- Renaming environments in the Business Central admin center
- Sandbox environments can be updated to a Public Preview version
- Developers can emit telemetry to Application Insights from AL code
- Extension publishers can get telemetry in Azure Application Insights

General
- Support for latest Microsoft Dynamics 365 SDK
- Expanded country and regional availability

I have the feeling not everything is included in this list, honestly. There isn't much mentioned on the VSCode level, while we know there is going to be quite some work in the "WITH" area .. . And we expect to have "pragmas" in code available in the next release as well, or so I understood. That's just a couple of things you could see in the "Interfaces and extensibility: Writing extensible and change-resilient code" session of Microsoft's recent Virtual Conference.
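By the way, the "Developers can emit telemetry to Application Insights from AL code" item builds on the Session.LogMessage call that already exists in the platform. Just as a minimal sketch of what emitting a custom signal to the extension publisher's Application Insights could look like (the event id, message and custom dimension below are made up for illustration, and it assumes the applicationInsightsKey is set in the extension's app.json):

procedure LogPostingDuration(DurationMs: Integer)
begin
    // Sends a custom telemetry event to the publisher's Application Insights resource
    Session.LogMessage('MYEXT0001', 'Posting routine finished',
        Verbosity::Normal, DataClassification::SystemMetadata,
        TelemetryScope::ExtensionPublisher, 'durationMs', Format(DurationMs));
end;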
↧
Blog Post: Installing a DevOps Agent (with Docker) with the most chance of success
You might have read my previous blog on DevOps build agents. Since then, I've been quite busy with DevOps, and especially with ALOps. And I had to conclude that one big bottleneck keeps being the same: a decent (stable) installation of a DevOps build server that supports Docker with the images from Microsoft. Or in many cases: a decent build agent that supports Docker, not even having anything to do with the images from Microsoft. You have probably read about Microsoft's new approach to providing images: Microsoft is not going to provide you any images any more, but will help you create your own images, all with navcontainerhelper. The underlying reason is actually exactly the same: something needed to change to make "working with BC on Docker" more stable.

Back to Build Agents
In many support cases, I had to refer back to the one solution: "run your Docker images with Hyper-V isolation". While that solved the majority of the problems (anything regarding alc.exe (compile) and finsql.exe (import objects)) .. in some cases it wasn't solving anything, which leaves only one conclusion: it's your infrastructure, i.e. the version of Windows and/or how you installed everything. So .. that made me conclude that it might be interesting to share with you a workflow that, in some perspective, doesn't make any sense, but does solve the majority of the unexplainable problems with using Docker on a build server for AL development :-).

Step 1 – Install Windows Server 2019
We have the best results with Windows Server 2019, as it's more stable and is able to use the smaller images for Docker.

Step 2 – Full Windows updates
Very important: don't combine Docker, Windows updates and such. First, install ALL Windows updates and then reboot the server. Don't forget to reboot the server after ONLY installing all Windows updates.

Step 3 – Install the necessary Windows features
So, all Windows updates have been applied and you have restarted. Time to add the components that are necessary for Docker. With this PowerShell script, you can do just that:
Install-WindowsFeature Hyper-V, Containers -Restart
You see, again, you need to restart after you did this! Very important!

Step 4 – Install Docker
You can also install Docker with a script:
Install-PackageProvider -Name NuGet -MinimumVersion 2.8.5.201 -Confirm:$false -Force
Install-Module DockerProvider -Confirm:$false -Force
Install-Package Docker -RequiredVersion 19.03.2 -ProviderName DockerProvider -Confirm:$false -Force
You see, we refer to a specific version of Docker. We noticed not all versions of Docker are stable. This one is, and we always try to test a certain version (with the option to roll back), instead of just applying all new updates automatically. For a build agent, we just need a working Docker, not an up-to-date Docker ;-).

Step 5 – The funky part: remove the "Containers" feature
What? Are you serious? Well .. yes. Now, remove the Containers feature with this script and, very important, restart the server again!
Uninstall-WindowsFeature Containers
Restart-Computer -Force:$true -Confirm:$false

Step 6 – Re-install the "Containers" feature
With a very similar script:
Install-WindowsFeature Containers
Restart-Computer -Force:$true -Confirm:$false
I can't explain why these last two steps are necessary, but it seems the installation of Docker messes up something in the Containers feature that, in some cases, needs to be restored.. . Again, don't forget to restart your server!
Step 7 – Disable Windows Updates
As Windows updates can terribly mess up the stability of your build agent, I always advise disabling them. When we want to apply Windows updates, what we do is just execute the entire process described above again! Yes indeed .. again!

That's it! You might ask yourself: is all this still necessary once we move to the new way of working with Docker, where we build our own images and such? Well, I don't know, but one thing I do know: the problems we have had to solve were not all related to the Business Central images. Some were just about "Docker" itself and the way Docker was talking to Windows .. (or so we assumed). So I guess it can't hurt to try to find a way to set up your build servers in such a way that you know it's just going to work right away.. . And that's all I tried to do here ;-).
↧
Blog Post: Dynamics 365 Business Central: exploring page views with Azure Application Insights
I think you already know that I'm a huge fan of using Azure Application Insights for collecting telemetry for different types of applications, hosted in the cloud or on-premises. At my session at Ignite Tour 2020 in Milan I showed how you can fully monitor and debug an application hosted on Azure with Application Insights and Azure Monitor. Dynamics 365 Business Central also permits you to collect quite a large set of telemetry signals in Azure Application Insights, and in the past I've explained how to do it. Today I want to talk about a new, interesting set of telemetry signals related to page views that, starting from version 16.3, Dynamics 365 Business Central sends to Azure Application Insights. Page view telemetry collects data about the pages that users open in the Business Central client. Each page view entry gives you information about the user's page usage, like how long it took to open the page, information about the user's environment, and much more. In Application Insights, telemetry about page views is logged to the pageViews table. As an example, by going to the Logs section and executing a KUSTO query on the pageViews table, you can see the pages viewed on my production tenant in the last hour, with details about the user session, the client and more: With another KUSTO query, you can see, for example, your top 10 viewed pages in a particular time range: But there's a lot more than raw data. The pageViews table used by Application Insights also enables an interesting new set of features related to usage analysis, which you can find under the Usage section. Here:
- The Users report counts the number of unique users that access your pages within your chosen time periods.
- The Sessions report counts the number of user sessions that access your application.
- Events permits you to analyze how often certain pages and features of your app are used. A page view is counted when a browser loads a page from your app.
- Funnels: if your application involves multiple stages, you need to know if most customers are progressing through the entire process, or if they are ending the process at some point. The progression through a series of steps in a web application is known as a funnel. You can use Azure Application Insights Funnels to gain insights into your users, and monitor step-by-step conversion rates.
- The User Flows tool visualizes how users navigate between the pages and features of your site.
- Retention helps you understand how often your users return to use their app, based on cohorts of users that performed some business action during a certain time bucket.
- Impact analyzes how load times and other properties influence conversion rates for various parts of your app.
- A Cohort is a set of users, sessions, events, or operations that have something in common. In Azure Application Insights, cohorts are defined by an analytics query (more info here).
To explain funnels, imagine that I want to analyze how many users execute a particular page flow. You can create a funnel by specifying your page flow and you can see the users' progress through it: On my demo tenant I unfortunately don't have too much user data, but you can see here that you can inspect the page flow, see the percentage of users that go to each step of your page flow and, for each step, the pages opened immediately before and after. The User Flows section is extremely interesting for understanding what users are doing during a session.
The User Flows tool starts from an initial page view, custom event, or exception that you specify. Given this initial event, User Flows shows the events that happened before and afterwards during user sessions. Lines of varying thickness show how many times each path was followed by users. Special Session Started nodes show where the subsequent nodes began a session. Session Ended nodes show how many users sent no page views or custom events after the preceding node, highlighting where users probably left the application. As an example, this is what happens on my tenant starting from the Business Manager Role Center page: As you can see, I have some sessions that immediately stop, others that check the G/L Entries and G/L Accounts, and other sessions that go through Customers. Some of these page views occur before the starting event, others occur after (in the above sample, I have a session that returned to the role center after checking the chart of accounts). If you want to analyze only the page views that occur AFTER a particular event, just set the previous step number to 0: Then, the Retention analysis is one of the most fascinating in my opinion. With this feature, you can analyze how many users return to your application, and how often they perform particular tasks or achieve goals. Here is the analysis that I can see on my tenant: By default, Retention shows all users who did anything and then came back and did anything else over a period. You can select different combinations of events to narrow the focus on specific user activities. On this chart, the overall retention chart shows a summary of user retention across the selected time period. The grid shows the number of users retained according to the filters applied. Each row represents a cohort of users who performed any event in the time period shown. Each cell in the row shows how many of that cohort returned at least once in a later period. Some users may return in more than one period. The insights cards show the top five initiating events and the top five returned events, to give users a better understanding of their retention report. The Impact section can be used to analyze whether a particular slow page affects the usage of other pages in your application. More technically speaking, it discovers how any dimension of a page view, custom event, or request affects the usage of a different page view or custom event. As an example, here I'm analyzing the impact of the Business Manager Role Center page opening time on Customer List adoption: The analysis shows that the usage of the Customer List page somewhat decreases as the "duration" of the Business Manager Role Center page increases (Correlation = -0.63). Don't forget also the More section (quite hidden), which gives you other interesting information about how your users use your application, like details of page usage and time spent on a page: and more: I think you've understood how interesting this set of telemetry data related to page views is. You can monitor how your users work in your application, how their business flows perform, how a slow page can impact a particular flow and so on. This is absolutely extremely valuable information for every partner or ISV, I think, so please use it and inspect your telemetry data. Microsoft is actively working on this, and they're doing wonderful work…
↧
Forum Post: Get Vendor Name in report
I have a field "Purch. Rcpt. Line"."Buy-from Vendor No." and I want to get the "Vendor"."Name" in a report. Please help me. Thanks in advance.
↧
Forum Post: Calculate the Amount Vendor Ledger Entry And Insert Table
Hi everybody, I created a table with the following fields:
Table Name: Cal Balance Vendor
Vendor Code
Ending Date
Currency Code
Total Amount
Line No
I use this table to save the results of the calculations at the end of the month, for the purpose of reporting vendor liabilities. I'm building a new report on the Cal Balance Vendor data source that calculates by Ending Date:
VendLedEntries.SETCURRENTKEY("Vendor No.","Posting Date","Currency Code");
VendLedEntries.SETRANGE("Vendor No.","Vendor Code");
VendLedEntries.SETRANGE("Posting Date",0D,EndingDate);
VendLedEntries.SETRANGE("Currency Code","Currency Code");
IF VendLedEntries.FINDSET THEN BEGIN
  REPEAT
    VendLedEntries.CALCSUMS(Amount);
    TotalBalAmount := VendLedEntries.Amount;
  UNTIL VendLedEntries.NEXT = 0;
  LineNo += 1000;
  "Cal Balance Vendor".INIT;
  "Cal Balance Vendor"."Amount" := TotalBalAmount;
  "Cal Balance Vendor"."Ending Date" := EndingDate;
  "Cal Balance Vendor"."Currency Code" := VendLedEntries."Currency Code";
  "Cal Balance Vendor"."Line No" := LineNo;
  "Cal Balance Vendor".INSERT;
END;
I cannot insert data into the table. Please help me with a solution. Thanks very much.
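For what it's worth, here is a possible restructuring of that calculation in the same C/AL style, only as a sketch: VendorCode, EndingDate and CurrencyCode stand for whatever variables or fields the report actually uses, and a common reason an INSERT fails is that not all primary key fields are populated.

VendLedEntries.RESET;
VendLedEntries.SETCURRENTKEY("Vendor No.","Posting Date","Currency Code");
VendLedEntries.SETRANGE("Vendor No.",VendorCode);
VendLedEntries.SETRANGE("Posting Date",0D,EndingDate);
VendLedEntries.SETRANGE("Currency Code",CurrencyCode);
VendLedEntries.CALCSUMS(Amount);                       // one CALCSUMS over the filtered set, no REPEAT loop needed
LineNo += 1000;
"Cal Balance Vendor".INIT;
"Cal Balance Vendor"."Vendor Code" := VendorCode;      // populate the key fields before INSERT
"Cal Balance Vendor"."Line No" := LineNo;
"Cal Balance Vendor"."Ending Date" := EndingDate;
"Cal Balance Vendor"."Currency Code" := CurrencyCode;
"Cal Balance Vendor"."Total Amount" := VendLedEntries.Amount;   // the table field is "Total Amount", not "Amount"
"Cal Balance Vendor".INSERT(TRUE);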
↧
Forum Post: RE: Get Vendor Name in report
You can use a variable VendorName:
IF Vendor.GET("Buy-from Vendor No.") THEN
  VendorName := Vendor.Name;
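In context, a sketch of how this could look in the report, assuming a data item for "Purch. Rcpt. Line" and global Vendor (Record) and VendorName (Text) variables (the names are illustrative):

// In the "Purch. Rcpt. Line" data item's OnAfterGetRecord trigger:
IF Vendor.GET("Purch. Rcpt. Line"."Buy-from Vendor No.") THEN
  VendorName := Vendor.Name
ELSE
  VendorName := '';   // clear it so the previous line's name is not carried over
// Then print the VendorName variable in the report layout.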
↧
Forum Post: AL Extension for Auto-Downloading / Importing of Bank Statements into Business Central
I'm based in the UK. Is anyone aware of an AL Extension in existence for Auto-Downloading / Importing of Bank Statements into Business Central? I understand that Yodlee doesn't support UK Bank feeds anymore? Correct? Are there any alternatives? For instance, could it be split into two tasks: (1) Auto-download from bank (2) Auto-Import into BC, using different toolkits?
↧
Blog Post: Dynamics 365 Business Central: loading demo data for your extensions
I've received in the last days a request from a forum user asking for a way to automatically load demo data in an extension, maybe by embedding a data file inside the extension package and loading it when needed, for example during the install phase. Unfortunately, you cannot embed data files inside your binary .app file and read them from the binary .app file folder structure. I promised a personal response, and here it is. I think there are different ways of loading demo data for an extension, but the way I normally suggest is the following:
- Create a Configuration Package (.rapidstart file) with your demo data
- Save the .rapidstart file to a publicly available Azure Blob Storage
- Load the RapidStart package file from your extension when needed.
I've talked about loading Configuration Packages from AL by using streams in the past, here and here. To load demo data for your extension, you can do the same. As a first step, create an Azure Storage Account and inside this storage account create a container for hosting blobs (here called demodata). For this container, set Public Access Level = Blob: In this way, your container will have anonymous access for reading the blob content, and you can avoid sharing access keys for loading demo data (obviously, this is up to you; you may prefer to require an access key instead). When the container is created, you can create a new Configuration Package. Here as a demo I'm using an extension that has a table called WorkItem: Create your demo data (for example from Excel) and then import the data into your package: Here I have 9 demo records. Then, export the package with your data as a .rapidstart file and upload this file to your Azure Blob Storage container: You will now have a public URL for accessing your package file, in the form https://STORAGEACCOUNTNAME.blob.core.windows.net/CONTAINERNAME/PACKAGENAME Now the code part. In my extension, I create an Import Demo Data action for loading the demo data from the RapidStart package (but you can also trigger it automatically from your code): This action calls a codeunit that does all the data loading. The code is as follows (see the sketch after the list at the end of this post): I think it's quite self-explanatory. By using the HttpClient object, it downloads the package from the storage account into a stream and then calls the ImportAndApplyRapidStartPackageStream method of the Config. Package – Import codeunit to load and apply the package to the database. The result: your demo data is automatically loaded into your extension. As said before, you can load demo data for an extension in different ways, but the way described here is normally my first suggestion because:
- It requires only a few lines of AL code and no other external objects
- You can create your demo data easily with Excel and a Configuration Package (so you can delegate this task to any consultant)
- You can easily refresh the demo data whenever you want (just replace the package with an updated one)
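A minimal sketch of such a codeunit (the object and variable names are illustrative, the package URL is the placeholder from above, and the exact signature of ImportAndApplyRapidStartPackageStream may vary between Business Central versions):

codeunit 50100 "Load Demo Data"
{
    procedure ImportDemoData()
    var
        ConfigPackageImport: Codeunit "Config. Package - Import";
        Client: HttpClient;
        Response: HttpResponseMessage;
        PackageInStream: InStream;
        PackageUrlTxt: Label 'https://STORAGEACCOUNTNAME.blob.core.windows.net/CONTAINERNAME/PACKAGENAME', Locked = true;
    begin
        // Download the .rapidstart package from the public blob container
        if not Client.Get(PackageUrlTxt, Response) then
            Error('Cannot reach the demo data storage.');
        if not Response.IsSuccessStatusCode() then
            Error('Download failed with status code %1.', Response.HttpStatusCode());
        Response.Content().ReadAs(PackageInStream);

        // Import and apply the configuration package from the stream
        ConfigPackageImport.ImportAndApplyRapidStartPackageStream(PackageInStream);
    end;
}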
↧
Blog Post: Business Central Spring 2019 Update (BC14) CU14 TearDown
Cumulative Update 14 for Microsoft Dynamics 365 Business Central April '19 on-premises (Application Build 14.15.43800, Platform Build 14.15.43793) TearDown. This update does not require a database conversion from the previous version, and there are no known security vulnerabilities fixed in this platform version. The ChangeLog for the Finnish localization weighs 1.9 MB, and there are 160 changed tables and 141 changed reports (!), so this package seems very interesting at first look. Spoiler alert! Most of the changes are corrections to translations in the Finnish version, so this package is not as big as it first seems. For example, the translation for "Posting Date", which used to be translated as "Kirjauspvm.", is now translated without the dot as "Kirjauspvm". This is now in line with the other date translations, which have been without the ending dot. Removing the extra dot has caused something like 95% of the changes in the Finnish localization package. I am happy MS has changed this, since special characters in field names always make it a bit difficult to create integrations, because Web Services show them as underscores. This makes it impossible to have, for example, "Profit%" and "Profit." on the same page, since both of these fields would be shown as "Profit_" on a published WS page. Also, instead of translating Assets as "Käyttöomaisuus", it is now translated as "Vastaavaa", which is the correct accounting term in Finnish. "Käyttöomaisuus" would translate only to "Fixed assets", while Assets is a broader term. There are a total of 356 changed objects from the previous CU, and no new objects in this release.

VAT Rate Change Tool
The VAT Rate Change Tool has learned some new tricks. Now it can also change VAT prod. posting groups on lines that have the "Prices Including VAT" checkmark and are of type G/L Account, Charge (Item) or Fixed Asset. Table 550 VAT Rate Change Setup has three new fields:
110 ; ;Update Unit Price For G/L Acc.;Boolean
111 ; ;Upd. Unit Price For Item Chrg.;Boolean
112 ; ;Upd. Unit Price For FA;Boolean

Generic Data Exchange functionality
Table 1225 Data Exch. Field Mapping has one new field, which is also inserted into a secondary key. This is used when the order of processing the exchanged fields matters:
30 ; ;Priority ;Integer

Other stuff
The virtual Microsoft Inspire event is live today; please check this page https://partner.microsoft.com/en-US/inspire/ to register and attend for the latest information on the Dynamics family! Have a nice summer and be safe!
↧
Forum Post: RE: Get Vendor Name in report
I will be happy to help, but let me ask first: what did you try so far?
↧
Blog Post: AL language and the “missing documentation” warning
Yesterday evening I received a message from a user asking me if something has changed in the AL language extension. When he loaded an AL project in Visual Studio Code, he started receiving tons of information messages like "The procedure XXX missing documentation": Has something really changed in the AL language extension? Maybe something related to the announced plans for XML code comments? No… nothing related to that. What causes this "issue" is a new update of the AL XML Documentation extension. If you have previously installed this extension, or if you're using my extension pack for Dynamics 365 Business Central (where this extension is part of the default installed toolset), after the update you start receiving this "warning" (although it's not a warning but a simple information message). You can simply disable it by going to the AL XML Documentation extension's settings and unchecking the following option: Or, if you prefer to edit the JSON settings file directly, set the corresponding setting as follows:
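For reference, the kind of XML documentation the extension is looking for is a comment block above each procedure; a minimal example (the object, procedure and parameter names are purely illustrative):

codeunit 50120 "My Documented Codeunit"
{
    /// <summary>
    /// Returns the display name of the given vendor.
    /// </summary>
    /// <param name="VendorNo">The vendor number to look up.</param>
    /// <returns>The vendor name, or an empty string when the vendor does not exist.</returns>
    procedure GetVendorName(VendorNo: Code[20]): Text[100]
    var
        Vendor: Record Vendor;
    begin
        if Vendor.Get(VendorNo) then
            exit(Vendor.Name);
        exit('');
    end;
}

Once a procedure has such a comment, the "missing documentation" message for it should go away.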
↧
Forum Post: How to read No. of characters in a .csv file
Hi folks - I have an XMLport reading a .CSV file which is comma separated. When each line is imported, how do I check the number of characters in the line? In the example below I have 30 characters in one line of the spreadsheet and 35 characters in another line due to Column G; as each line is imported, if the number of characters exceeds 30 I want to skip the line. Is it doable? I would truly appreciate it if someone knows how it can be done. Any help is truly appreciated. Thanks RJ.
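One possible approach, just as a sketch: it assumes the CSV columns are mapped to text variables Col1..Col7 in the XMLport (the variable names and the column count are illustrative), so in the table element's import trigger you can rebuild the line length from the imported columns and skip the record when it is too long.

// In the XMLport's table element, e.g. the OnBeforeInsertRecord trigger:
trigger OnBeforeInsertRecord()
var
    LineLength: Integer;
begin
    // length of all column values plus 6 commas for 7 comma-separated columns
    LineLength := StrLen(Col1) + StrLen(Col2) + StrLen(Col3) + StrLen(Col4) +
                  StrLen(Col5) + StrLen(Col6) + StrLen(Col7) + 6;
    if LineLength > 30 then
        currXMLport.Skip();   // do not import this line
end;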
↧
Forum Post: Filtering oData page
Hi everyone, I have published a page as a web service. On that page, I show one variable that is calculated in the OnAfterGetRecord event of the page. But I see that when I try to filter on that variable via the URI, it is not working. Filtering only works with the fields that are in the table. Is it possible to filter on that calculated variable? Basically I've created a variable that is the customer number, and I want to filter the contacts with a known customer number, something like this: XXXXX:7058/.../ContactAPI eq '000010' Thank you
↧
Forum Post: RE: Filtering oData page
Sorry, the URL is like this: http://XXXX:7058/DynamicsNAV110/ODataV4/Company('XXXX.')/ContactAPI?$filter=gCustNo eq '000010'
↧
Blog Post: Creating a local drive mapped with an Azure File Share for your files in the cloud
File management in the cloud is always a hot topic when using SaaS services like Dynamics 365 Business Central. A common request that I always see popping up on forums or from partners and customers is whether it's possible to automatically save a file to a local drive from Dynamics 365 Business Central (or, more generally, from SaaS applications). As you can imagine, from a SaaS tenant you don't have access to the local file system (so you cannot directly save a file to your C: drive or to your local network share folder). I've talked about this topic in the past (our Mastering Dynamics 365 Business Central book provides a full solution for this task) and I've also shared with you a solution on this blog that permits you to save a file from Dynamics 365 Business Central to a cloud storage (like Azure Blob Storage) and then also to an SFTP server. I know that many of you are using the solution provided in our book, but a question that I sometimes receive is: can I map this cloud storage to a local drive, so that my users can manage files transparently, working exactly as they are used to doing with files on their local machine? As standard, you cannot map an Azure Blob Storage container as a local drive. For this scope, you can use Azure Files, which has support for mapping drives on both local and Azure-hosted systems. Usage is exactly the same as I've described in my book or in my blog post linked above. Azure Files and Azure Blob Storage both offer ways to store large amounts of data in the cloud, but they are useful for slightly different purposes. Azure Blob Storage is useful for massive-scale, cloud-native applications that need to store unstructured data. To maximize performance and scale, Azure Blob Storage is a simpler storage abstraction than a true file system. You can access Azure Blob Storage only through REST-based client libraries (or directly through the REST-based protocol). Azure Files is specifically a file system in the cloud. Azure Files has all the file abstractions that you know and love from years of working with on-premises operating systems. Like Azure Blob Storage, Azure Files offers a REST interface and REST-based client libraries. Unlike Azure Blob Storage, Azure Files offers SMB access to Azure file shares. By using SMB, you can mount an Azure file share directly on Windows, Linux, or macOS, either on-premises or in cloud VMs, without writing any code or attaching any special drivers to the file system. You can also cache Azure file shares on on-premises file servers by using Azure File Sync for quick access, close to where the data is used. So, what's the full solution that I normally use to satisfy this requirement? To upload a file to an Azure File Share, simply modify the UploadFile Azure Function from the previous post so that it calls a new UploadBlobAsyncToFileShare function instead of the previous UploadBlobAsync.
The new UploadBlobAsyncToFileShare function is defined as follows (it works with a CloudFileShare object instead of a CloudBlobContainer object; please note that to reduce the code lines here I've not handled logging and exceptions):

public static async Task<Uri> UploadBlobAsyncToFileShare(string base64String, string fileName, string fileType, string fileExtension, string folderName)
{
    string fileShareConnectionString = "CONNECTION_STRING";
    string shareName = "share_name_lowercase";
    string contentType = fileType;
    byte[] fileBytes = Convert.FromBase64String(base64String);

    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(fileShareConnectionString);
    CloudFileClient client = storageAccount.CreateCloudFileClient();
    CloudFileShare share = client.GetShareReference(shareName);

    // Create the share if it does not exist
    await share.CreateIfNotExistsAsync();

    // Reference to the root directory
    CloudFileDirectory rootDirectory = share.GetRootDirectoryReference();
    CloudFileDirectory fileDirectory = null;
    if (string.IsNullOrWhiteSpace(folderName))
    {
        // There is no folder specified, so return a reference to the root directory.
        fileDirectory = rootDirectory;
    }
    else
    {
        // There was a folder specified, so return a reference to that folder.
        fileDirectory = rootDirectory.GetDirectoryReference(folderName);
        await fileDirectory.CreateIfNotExistsAsync();
    }

    // Set a reference to the file.
    CloudFile file = fileDirectory.GetFileReference(fileName);
    using (Stream stream = new MemoryStream(fileBytes, 0, fileBytes.Length))
    {
        await file.UploadFromStreamAsync(stream).ConfigureAwait(false);
    }
    return file.Uri;
}

Then, you need to map the file share as a local drive. You can do this task directly from the Azure Portal, or you can use Azure PowerShell. I normally prefer the PowerShell way, and the script that creates the Azure File Share and then maps it as a network drive is explained here. Creating the Azure File Share instance is obviously a one-time operation. First you need to create a storage account, then you can create a file share instance on that storage account, and then you can map this instance to a local drive. The PowerShell script that performs all this for you is as follows: When the file share is created, you have an endpoint like the following: You can map this endpoint to a local drive letter by using the New-PSDrive cmdlet: The New-PSDrive cmdlet creates temporary and persistent drives that are mapped to or associated with a location in a data store. Here I'm using the -Persist option in order to create a persistent drive. A persistent drive will remain active also when you close the PowerShell session (otherwise, temporary drives exist only in the current PowerShell session and in sessions that you create in the current session, and you can't access them by using File Explorer). When mounted, you can see a new local disk (here mapped to the X: letter) and you can now use it as a normal drive: Your users can now work with files in your SaaS tenant exactly like in a local drive. The complete PowerShell script that I'm using is available here.
↧
Blog Post: New Upcoming Conference: DynamicsCon
I just wanted to raise some attention to a new conference in town: DynamicsCon. Quite interesting, because it's perfectly aligned with the current world situation regarding COVID-19: it's a virtual event .. and it's free! I'm not saying I prefer virtual events. I don't. But given the circumstances, I guess it makes sense, and it has some advantages as well: you will be able to see all content, all sessions are pre-recorded (which means: demos will work ;-)), and you can do it from your living room without losing any time on traveling. Now, the committee is handling this really well: they have been calling for speakers for a while, and many people reacted. Really anyone could submit session topics to present. As I did as well (you might have figured out already that I like to do this kind of stuff). So how do they pick the topics/speakers? Well, anyone who registers can vote for sessions! So please, if you didn't register yet: do so now, and until August 1st (that's not far out!) you can help the committee pick the topics most people want to see during the conference. The sessions with the most votes will be picked! I'm not going to advertise my sessions; just pick based on topics. That makes the most sense! Some highlights of the conference:
- It's free
- It's virtual
- It's not just for Business Central. These are the tracks: 365 Power Platform, 365 Finance & Operations, 365 Customer Engagement, 365 Business Central
- There will be Q&A panels during the conference
- Recorded sessions which will end up on YouTube!
- Date: September 9-10
↧
Forum Post: Converting a Sales quote into a Sales Order using C/AL code
How can we convert a Sales Quote to a Sales Order using a base codeunit of NAV? Please also name the codeunit. Thanks in advance.
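For reference, the base object normally used for this is codeunit 86 "Sales-Quote to Order". A minimal C/AL-style sketch (the quote number '1001' is just an example, SalesQuoteToOrder is a variable of codeunit 86, and SalesHeader/SalesOrderHeader are "Sales Header" record variables):

// Convert an existing sales quote into a sales order using the base codeunit.
SalesHeader.GET(SalesHeader."Document Type"::Quote, '1001');
SalesQuoteToOrder.RUN(SalesHeader);
// Codeunit 86 also exposes the created order in recent versions:
SalesQuoteToOrder.GetSalesOrderHeader(SalesOrderHeader);
MESSAGE('Created sales order %1', SalesOrderHeader."No.");

Codeunit 1305 "Sales-Quote to Order (Yes/No)" wraps the same conversion behind a confirmation dialog.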
↧