I'm having trouble finding the conditions for a workflow in table form. Does anyone know where these are stored? The purpose is either to RapidStart changes to the conditions or to report on the differences between our many workflows.
↧
Forum Post: Workflow Conditions - Where are they stored?
↧
Forum Post: RE: Workflow Conditions - Where are they stored?
There are two types of workflow steps, with respective types of conditions assigned to them.

1. On record changes - workflow steps where you can set up a response to a value increase/decrease. Conditions are stored in table 1524 "Workflow Rule".
2. Other steps (business events) - conditions are in table 1523 "Workflow Step Argument". The BLOB field "Event Conditions" stores a text line with all filters applied to the event table and its related tables. "Event table" here is the value of the "Table ID" field of the "Workflow Event". Related tables are identified by the setup in table 1515 "Dynamic Request Page Entity".
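For the comparison/reporting scenario, a small AL sketch along these lines could pull the stored condition string out of a workflow step argument (assuming the standard table and field names mentioned above; the text encoding is an assumption and ReadEventConditions is just an illustrative helper):

```al
procedure ReadEventConditions(WorkflowStepArgument: Record "Workflow Step Argument"): Text
var
    InStr: InStream;
    ConditionsLine: Text;
begin
    // "Event Conditions" is a BLOB, so it has to be loaded explicitly before reading.
    WorkflowStepArgument.CalcFields("Event Conditions");
    if not WorkflowStepArgument."Event Conditions".HasValue() then
        exit('');
    // Adjust the text encoding if the stored text does not read correctly.
    WorkflowStepArgument."Event Conditions".CreateInStream(InStr, TextEncoding::UTF8);
    // The conditions are stored as a single text line of filters.
    InStr.ReadText(ConditionsLine);
    exit(ConditionsLine);
end;
```

Looping over table 1523 with such a helper gives a plain-text list of event conditions per workflow step that can be exported and compared across workflows.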
↧
Blog Post: Dynamics 365 Business Central: using OData V4 Bound Actions
I promised this post to some attendees of my last Dynamics 365 Business Central development workshop at Microsoft Italy (c/o Microsoft House) last week. The question was: how can I call Dynamics 365 Business Central logic from an external application? The simple answer given by all: you can publish a codeunit as a web service and use the SOAP endpoint. But what if I want to use OData? You cannot publish a codeunit as an OData endpoint. The answer: you can call custom Dynamics 365 Business Central functions via OData V4 by using Bound Actions. Bound actions are actions related to an entity that you can call via an HTTP request, and they can invoke your business logic (functions). Unfortunately, the documentation on how to use OData V4 bound actions with Dynamics 365 Business Central is quite poor, and with this post I would like to help clarify the topic a bit.

There are two main scenarios that I want to cover here:
- calling a D365BC codeunit that performs business logic, like creating new entities;
- calling a procedure in a D365BC codeunit by passing parameters and reading a return value.

For this sample I've created an extension with a codeunit that contains the business logic that I want to call via OData. The codeunit has two business functions: CloneCustomer, which creates a new customer based on an existing customer record, and GetSalesAmount, which returns the total sales amount for a given customer.

To use OData V4 bound actions you need to declare a function in a page, and this function must have the [ServiceEnabled] attribute. For this demo project I would like to publish the Customer Card (page 21) as an OData web service, so the natural thing to do is to create a pageextension object of the Customer Card, add our [ServiceEnabled] procedure there, and then publish the Customer Card as a web service. If you try to do this, it will never work! If you declare a [ServiceEnabled] function in a pageextension object and you try to reach the metadata of the OData endpoint (baseurl/ODataV4/$metadata), you will not see the action published. To publish your action attached to the Customer entity, you need to create a new page and publish that as a web service (a sketch of such a page is included at the end of this post). On this page, the ODataKeyFields property specifies which field to use as the key when calling the OData endpoint (I want the "No." field of the Customer record).

Inside this page, I declare two procedures that call the two methods defined in our AL codeunit: CloneCustomer is a procedure called without parameters; it takes the context of the call and invokes the CloneCustomer method defined in our codeunit. GetSalesAmount is a procedure that takes a Code parameter, calls the GetSalesAmount procedure defined in our codeunit and returns the result as the response.

What happens with these definitions when we publish the MyCustomerCard page as a web service (here called MyCustomerCardWS)? If we look at the OData V4 metadata, we can see that the actions are now published. Now we can try to call our bound actions via OData. As a first step, we want to call the CloneCustomer function. For this, we need to send a POST request to the following endpoint: https://yourbaseurl/ODataV4/Company('CRONUS%20IT')/MyCustomerCardWS('10000')/NAV.CloneCustomer. I'm using the REST Client extension to send HTTP requests to this endpoint.
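A sketch of what that request can look like in REST Client syntax; the host and the {{credentials}} placeholder (base64-encoded user name and web service access key) are assumptions, and since CloneCustomer takes no parameters a minimal JSON body such as {} is enough:

```http
POST https://yourbaseurl/ODataV4/Company('CRONUS%20IT')/MyCustomerCardWS('10000')/NAV.CloneCustomer
Authorization: Basic {{credentials}}
Content-Type: application/json

{}
```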
What happens on the Dynamics 365 Business Central side? The code in our codeunit is called and a new Customer record is created (cloned from the customer with "No." = 10000 that was passed as the input).

Our second function, GetSalesAmount, wants a Code[20] parameter as input (not strictly needed, but it shows how to pass parameters to a bound action). We need to send a POST request to the following endpoint: https://yourbaseurl/ODataV4/Company('CRONUS%20IT')/MyCustomerCardWS('10000')/NAV.GetSalesAmount passing a JSON body with the parameter name and value. The response contains the total sales amount for the given customer (retrieved by calling our codeunit method).

There is one point to remember here, because it cost me hours of debugging: the parameter name you pass in the JSON object must match the OData metadata, not your function's parameter name. For example, if you declare the bound action with a parameter called CustomerNo (with capital letters), the name exposed in the OData metadata is the one the JSON body has to use (parameter names must match). Not so easy, but very powerful once you understand the logic.
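To make the whole setup concrete, here is a minimal, hypothetical sketch of such a page; the object ID, page name and codeunit name are invented for illustration, and the real code from the post may differ:

```al
page 50100 MyCustomerCard
{
    PageType = Card;
    SourceTable = Customer;
    ODataKeyFields = "No."; // key used in the URL: MyCustomerCardWS('10000')

    layout
    {
        area(Content)
        {
            field("No."; Rec."No.") { ApplicationArea = All; }
            field(Name; Rec.Name) { ApplicationArea = All; }
        }
    }

    [ServiceEnabled]
    procedure CloneCustomer(var ActionContext: WebServiceActionContext)
    var
        CustomerBizLogic: Codeunit "Customer Biz Logic"; // hypothetical codeunit holding the business logic
        NewCustomer: Record Customer;
    begin
        CustomerBizLogic.CloneCustomer(Rec, NewCustomer);
        // Tell the OData stack which entity the action produced.
        ActionContext.SetObjectType(ObjectType::Page);
        ActionContext.SetObjectId(Page::MyCustomerCard);
        ActionContext.AddEntityKey(Rec.FieldNo("No."), NewCustomer."No.");
        ActionContext.SetResultCode(WebServiceActionResultCode::Created);
    end;

    [ServiceEnabled]
    procedure GetSalesAmount(CustomerNo: Code[20]): Decimal
    var
        CustomerBizLogic: Codeunit "Customer Biz Logic";
    begin
        // The declared parameter name is what drives the name you see in $metadata.
        exit(CustomerBizLogic.GetSalesAmount(CustomerNo));
    end;
}
```

With a declaration like this, the JSON body sent to NAV.GetSalesAmount carries a single property whose name must match what $metadata reports for the CustomerNo parameter, for example { "customerNo": "10000" } if the metadata camel-cases it.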
↧
Forum Post: unit cost adjustment in Business Central on-prem
Hey, is it possible to adjust the Unit Cost on items that have the type Non-Inventory or Service? We have a customer that cannot do it once an item ledger entry has been created.
↧
Forum Post: RE: unit cost adjustment in Business Central on-prem
Hi, It would be nice to have some more info to try to answer your question. What is the "adjust unit cost" that you want to do? There are two batch jobs: "Adjust Item Costs/Prices" and "Adjust Cost - Item Entries". Is it one of these batch jobs that you are running? What kind of item ledger entry is created? Sales? Purchase? And what is going against expectations?
↧
Forum Post: inventory with lot numbers
Hello, I have a question about using lot numbers in inventory. In this example in the consumption journal I see lot no. L1 with a total quantity of 100. However, I already booked a few lines earlier via the consumption journal with that particular lot L1 (see the item ledger). In the Lot No. Information Card I see the correct quantity for that lot. Is there another way to consult the stock by lot, or is there an option to visualise the total quantity differently in the lot no. list? Thanks.
↧
Blog Post: DevOps for AL – ALOps is ALive!
Yes! I did it! I finally managed to create the blog post that I have been wanting to put online for quite some time. I have been putting off a lot of other posts just to be able to make this one happen asap! You might remember the post about Directions US 2019, where I said I had some work to do. Well, let's just say I'm trying to deliver on that promise, at least for 50% for now ;-). Namely, on what I think is the most important part: DevOps.

Indeed! In my opinion, every one of us (and I really mean everybody!) will at some point have to start organizing how to develop as a team. And not only that. Moving into SaaS, your customers will have more demands, like "upgrading": everyone will get a new version of Business Central, so you had better make sure that all your developments (apps AND PerTenantExtensions) are ready for the next release at all times. This means you'll have to compile, test, publish, ... all your apps, in combination with a multitude of localizations, every single day, for all your solutions and customers. Yeah, I know: build pipelines, right? Exactly. And a lot more, but to solve the "let's be prepared for the future" kind of thing, and "let's try to work as a team", build and release pipelines are there to help you. Even more: in my opinion, they are indispensable for doing anything with AL development.

Ok, but where is ALOps going to help us? Well, I noticed that most of us understand the concept of build pipelines, and even agree that they are absolutely necessary for moving to SaaS (development, maintenance, ...), and even OnPrem! But we are AL developers and C/AL developers, and we're good at that! Setting up DevOps is a lot of infrastructure, a lot of PowerShell, a lot of stuff we are not using in our daily life, let alone maintaining in our daily life. That's why so many people came up to me after my session at Directions to ask "can I use yours?". And THAT is exactly where ALOps will be able to help you: without having much knowledge of infrastructure, PowerShell or build pipelines, you will still be able to set up one or multiple pipelines in no time!

ALOps Extension
ALOps is an Azure DevOps extension that works on Azure DevOps online, but also on DevOps OnPrem. You can find it in the marketplace here: https://marketplace.visualstudio.com/items?itemName=Hodor.hodor-alops . You will have to download this to your DevOps to get the build pipelines running. Then you will have easy access to the new tasks while setting up your build pipeline. You can find a more detailed description of how to get ALOps into your DevOps here.

ALOps – How does it work
We are working hard on documentation. There is even a script that creates the documentation for us, straight from the task manifests of the extension, during the build of our extension ;-). That more or less means that all the important documentation of all steps is always up to date. Great! Let me walk you through all the information we have.

The marketplace
URL: https://marketplace.visualstudio.com/items?itemName=Hodor.hodor-alops
This is where it all starts. Here you download the DevOps extension to your DevOps environment. All important steps are documented right on this page. It can't get any easier than this. These "tasks" are actually the building blocks of a build pipeline, specifically for AL development. But don't worry, we will help you even more with building a pipeline.

Templates
URL: https://dev.azure.com/HodorNV/ALOps%20Templates/
We will do that with templates.
We will have template projects for you in the public project above. Just navigate to the repos; every repo in this project will be a certain template, containing all that is needed for an AL app, including:
- Recommended settings
- Base folder structure
- App.json
- Recommended .gitignore
- A build pipeline

We aim for this build pipeline to be as runnable as at all possible. So, if you fork, clone or import from this project, you already have a running build with a minimum of effort. A short description of how to do this can be found here.

Documentation and feedback on GitHub
URL: https://github.com/HodorNV/ALOps
We try to document the most important things about ALOps on GitHub, where we also gather feedback. The repo on GitHub is actually just a set of .md files which document ALOps. It's quite limited now, but we'll extend it during the lifecycle of ALOps, obviously ;-). An important part is obviously your feedback. Anyone who has questions, feedback or anything else: just use the "Issues" section of the repo and we'll react as fast as possible ;-).

Examples
Another thing to help you get started is my repos. I have set up two public projects for you to browse through and get ideas from:
- WaldoGitHubBuilds: with build pipelines for my own GitHub repos. At this point, just one: I'm building waldo.nav.pad (from GitHub) on my DevOps here.
- WaldoDevOpsDemos: just some DevOps repos covering different scenarios, like side-by-side (importing a fob), dependent apps, just a base app, and multiple apps in one repo (like Freddy does it). More scenarios will be online at some time in the (near) future.

Price
There is a big portion of ALOps that is free: simply said, it's free for the community, but it's not free for projects that are for business. What does that mean? ALOps is free for open source projects and any public projects! So, all MVPs and other community contributors that have an open source AL solution (like myself): go get busy and build away! For anything that is private (which I assume is trying to make a profit), there is a price. More information about that on the marketplace! That's also why I'm not ashamed at all to advertise it. I know that a lot of you will be able to use it for your own public, community stuff, like I do, and that means it can benefit the community a lot, like it has been benefiting myself, and also our company, A LOT.

Hope you like it! I'm not going to blog about every single detail, but I will be working on making your life with ALOps as easy as at all possible, I promise ;-). Enjoy!
↧
Forum Post: Shortcut dimension values are sometimes not shown in purchase invoice lines in NAV 2016
When the user fills in a value in the shortcut dimension 3 field on a purchase invoice line, this value sometimes disappears. In the background the dimension value is still present. Sometimes after a few seconds the value shows again. Does anyone know the cause of this?
↧
Forum Post: RE: Workflow Conditions - Where are they stored?
Thanks, I skipped right over the BLOB field because it was blank in dev environment view.
↧
Forum Post: My Notifications to show only customized tables/pages in NAV 2018
Hi Team, I have been working on how to display only customized pages in the My Notifications FactBox, but I was unable to get it working. Thanks
↧
Forum Post: RE: inventory with lot numbers
If this is the complete list of item ledger entries in the system, this is a very strange picture. The lot list shows 100 as the available quantity absolutely correctly, because that's the remaining quantity in the single inbound entry. The consumption entries are not applied to this purchase. But they are applied to something, since there is no remaining quantity on the consumptions (and consumption cannot produce negative stock, anyway). How were these consumption entries posted?
↧
Forum Post: Problem upgrading data source in PowerBi web
Hi everybody, We have a customer with NAV installed on a server. To connect to the server via RDP, I must use a VPN. On my laptop, I've created a Power BI report, adding 3 data sources from NAV. Of course, the VPN must be on, and when adding the data sources I didn't have any problem, because I entered in the credentials one of the users we use to sign in to NAV. Also, when I refresh the data sources, the changes in data are shown correctly. But when I publish the report to the web version and try to refresh data, I get a credential error. I'm trying to add to the credentials the same NAV account that I configured when adding the data sources in the desktop version, but I still get the error. Why could this be happening? Thank you very much
↧
Forum Post: RE: My Notifications to show only customized tables/pages in NAV 2018
My Notifications is showing all job queue errors, but I want it to be specific to customized tables
↧
Forum Post: RE: inventory with lot numbers
Hello, these consumptions are posted with the function that finishes production orders. There, the lot is chosen for the BOM. Regards
↧
Forum Post: barcodes in BC
Does anybody know the difference between item "Identifiers" and the item cross reference "Bar Code" type? I would like to use barcode scanning in a cash & carry outlet shop where they can scan the barcode of the goods to register them on a sales order.
↧
Forum Post: RE: Shortcut dimension values are sometimes not shown in purchase invoice lines in NAV 2016
If you update the page (F5), is the value still missing?
↧
Forum Post: RE: barcodes in BC
For a shop, it's better to use bar code cross references. Item identifiers are only required for ADCS in the warehouse: https://docs.microsoft.com/en-gb/dynamics-nav-app/warehouse-use-automated-data-capture-systems-adcs This information is very specific to that functional area, while cross reference codes will be carried over to all your sales and purchase documents.
↧
Forum Post: RE: inventory with lot numbers
I'm not sure I quite understand how the entries were posted. What is "Finished production orders function"? Consumption can be flushed automatically when the order is refreshed, when it's finished, or posted manually via Production Journal from the order card, or in a separate Consumption Journal. Which method was used? How were lot numbers assigned - in journal lines, or in component lines of the production order? And there is one more thing to check. Can you open Application Worksheet, select one of these consumption entries, and look which entry it is applied to?
↧
Blog Post: Dynamics 365 Business Central and API calls limits
Is there a limit on the number of requests that an external application can perform against the Dynamics 365 Business Central APIs? I think this is actually undocumented (please correct me if I'm wrong), but the answer is absolutely yes. Like every well-architected solution, the Dynamics 365 Business Central service limits the number of simultaneous calls within a certain sliding window. If you have an external service that performs too many requests on a tenant, you could receive an error like:

{ "error": { "code": "Application_TooManyRequests", "message": "Too many requests reached. Actual (101). Maximum (100)." } }

This is mainly to avoid things like denial-of-service (DoS) attacks and to avoid problems like lack of resources. What should you do in your external application to avoid this? You should manage how you perform requests to a Dynamics 365 Business Central API, and if you receive this error you should adopt something like a retry policy on your API calls (see the sketch at the end of this post). What are the official numbers for these limits on a SaaS environment? I will update this post when I have an official response.
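A minimal sketch of such a retry policy, written here in AL with HttpClient purely for illustration (a real external client would apply the same pattern in its own language; the assumption that throttled calls come back as HTTP 429 is mine, so adjust the check to whatever status code actually accompanies the Application_TooManyRequests payload):

```al
procedure GetWithRetry(Url: Text; MaxAttempts: Integer): Text
var
    Client: HttpClient;
    Response: HttpResponseMessage;
    Body: Text;
    Attempt: Integer;
begin
    for Attempt := 1 to MaxAttempts do begin
        if Client.Get(Url, Response) then begin
            if Response.IsSuccessStatusCode() then begin
                Response.Content().ReadAs(Body);
                exit(Body);
            end;
            // Assumption: the service signals throttling with HTTP 429.
            if Response.HttpStatusCode() <> 429 then
                Error('Request failed with status %1.', Response.HttpStatusCode());
        end;
        // Simple linear backoff before the next attempt; exponential backoff also works.
        Sleep(1000 * Attempt);
    end;
    Error('Request to %1 was still throttled after %2 attempts.', Url, MaxAttempts);
end;
```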
↧
Forum Post: 2009 R2 report to BC Cloud
Hi All, If I want to convert a customized NAV 2009 R2 report to BC cloud, what steps should I follow? I just want to confirm the process so that there are no surprise errors in BC. Thanks in Advance.
↧