Channel: Dynamics 365 Business Central/NAV User Group

Forum Post: RE: NAV 2018 - Analysis By Dimension

Hi, Analysis views are not created automatically from G/L entries. Normally, all posting routines call codeunit 410 at the end of the posting process to update analysis views. For example, you can find this code in codeunit 5980 "Service-Post": UpdateAnalysisView.UpdateAll(0,TRUE); What you need is to call the same function in your custom procedure.
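A minimal sketch of what that could look like in C/AL; the procedure name and the surrounding logic are assumptions, while the UpdateAll call is the one taken from codeunit 5980:

    // Hypothetical custom posting routine; only the UpdateAll call is
    // copied from the standard code in codeunit 5980 "Service-Post".
    PROCEDURE PostCustomEntries@1();
    VAR
      UpdateAnalysisView@1000 : Codeunit 410;
    BEGIN
      // ... create and insert the G/L entries here ...

      // Refresh analysis views the same way the standard posting
      // routines do, with the same arguments they pass.
      UpdateAnalysisView.UpdateAll(0,TRUE);
    END;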

Forum Post: RE: NAV 2018 - Analysis By Dimension

Thanks a lot, Alexander. I adopted the same solution and it's working fine now.

Blog Post: Dynamics 365 BC Wave 2 Release: What happens now?

The Microsoft Dynamics 365 Business Central Wave 2 release date is just around the corner, and there are many aspects you should be aware of. I've written an article for my friends at Simplanova that summarizes some important technical aspects to take care of. You can read it at the following link: https://simplanova.com/blog/dynamics-365-bc-wave-2-release/ Just a quick summary here: AL only: the application is split into two main apps, called Base Application and System Application. The System Application is a modular extension, and every module exposes a publicly accessible interface (facade), while the internal implementation of each module is hidden from the outside. Microsoft will not break your existing extension. Breaking changes are listed here (things that need a bit of refactoring). You will find many system objects that will be deprecated in favour of the new patterns. If you have code that uses system objects, you will receive warnings. Starting from the Wave 2 release (version 15), you need to refactor your code in order to use the System Application modules. More details in the Simplanova blog post. I recommend giving it a read.
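To give an idea of the kind of refactoring involved, here is a minimal sketch of moving from the deprecated TempBlob record to the System Application's "Temp Blob" facade codeunit; the wrapper procedures are illustrative, not from the base app:

    // Old pattern (v14 and earlier): the data travels in the TempBlob record.
    procedure WriteTextOld(TextValue: Text)
    var
        TempBlob: Record TempBlob temporary; // deprecated in v15
        OutStr: OutStream;
    begin
        TempBlob.Blob.CreateOutStream(OutStr);
        OutStr.WriteText(TextValue);
    end;

    // New pattern (v15): the "Temp Blob" codeunit, a System Application facade.
    procedure WriteTextNew(TextValue: Text)
    var
        TempBlob: Codeunit "Temp Blob";
        OutStr: OutStream;
    begin
        TempBlob.CreateOutStream(OutStr);
        OutStr.WriteText(TextValue);
    end;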

Blog Post: How Do I – Prevent an epic clusterfuck…

Now that the NDA on Business Central Wave II has been lifted and the DVD preview is released, partners have had time to look at the code Microsoft has refactored. The reactions vary from the marketing-correct to the more realistic. I have a strong opinion about what Microsoft did, and especially about how they did it. First of all, I agree that it's a great idea to split NAV up into modules, and I also agree that the architecture has to be modernised in more than one way. But that does not mean it has to happen with breaking changes, and most of all, it did not have to happen in Visual Studio Code with extensions. Microsoft is years, maybe a decade, too late in starting this project. To write decoupled code you don't need extensions and you don't need a fancy code editor. You need discipline and consistency. Especially the latter seems to be where Microsoft is totally off these days, moving away from patterns in a horrible way. (But that's a different blog.) As I suggested in many presentations, Microsoft should have added Table Extensions and Page Extensions to C/Side. They should have also added a column to the Object table called "module". The compiler should have been enhanced to check whether modules compile on their own. With these simple changes, modularity would have been possible a long time ago and the ecosystem would have been used to it. Let's not look the cow in its behind; let's see how you can prevent your partners from being forced to refactor their code. The problem Microsoft has now managed to place themselves into is that extensions on AppSource cannot be compatible with both Wave I and Wave II. This means tenants cannot be upgraded until partners are ready with the refactoring, which is a lot of work. It gets more difficult with per-tenant extensions. To upgrade the code, a partner has to compile against Docker or the installed DVD, but how does the customer test against their own data? Does the customer get to upgrade a sandbox? And if yes, how many times? To me this just shows that people at Microsoft are too far away from reality and living in a dreamworld. It can be done differently, and Microsoft actually implemented what I am going to suggest after being pushed a bit. In the current preview the TempBlob table has its old functions. So does the Language table. Both are moved from the Base App to System. The functions are marked to be removed in the future. The new functions are somewhere else. In earlier previews these functions were not there, as you can see in the GitHub repositories that are public for everyone to see. By putting the functions back under a bit of partner pressure, Microsoft prevented a lot of drama. The things that are still broken are primarily renamed codeunits and functions that changed signature. A simple example is the function to read the contents of a zip file, which changed from returning a temporary table to a list of text. To prevent breaking this, Microsoft's AL team introduced overloading. This allows creating a new and improved version while keeping the old one and marking it as obsolete for the future (a sketch of this follows below). The same can be done with new codeunits. Just leave the old ones there. Point them to the new code if you want to. BUT MAKE SURE EXTENSIONS CAN EASILY COMPILE AGAINST AT LEAST TWO ADJACENT VERSIONS! This way of moving API-related code has been normal in all frameworks for decades. Why can a huge company like Microsoft not do this with Business Central? I just cannot get my head around it.
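A minimal sketch of that overloading pattern in AL; the procedure name, the buffer table and the '15.0' tag are illustrative assumptions, not Microsoft's actual zip API:

    // Old signature, kept alive but marked obsolete; it delegates to the new one.
    [Obsolete('Replaced by the List of [Text] overload.', '15.0')]
    procedure GetEntries(var TempNameValueBuffer: Record "Name/Value Buffer" temporary)
    var
        EntryList: List of [Text];
        EntryName: Text;
    begin
        GetEntries(EntryList);
        foreach EntryName in EntryList do begin
            TempNameValueBuffer.Init();
            TempNameValueBuffer.ID += 1;
            TempNameValueBuffer.Name := CopyStr(EntryName, 1, MaxStrLen(TempNameValueBuffer.Name));
            TempNameValueBuffer.Insert();
        end;
    end;

    // New, improved signature; existing extensions keep compiling against the old one.
    procedure GetEntries(var EntryList: List of [Text])
    begin
        // read the zip entries into EntryList here
    end;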
I know it's cool to be an MVP. I've been an MVP for 11 years; I've traveled the world, and it gave me opportunities I could never have dreamed about. That does not mean you cannot have your own opinion, and it does not mean you always have to agree with what Microsoft does. I also understand how insanely difficult it must be for the Business Central team to survive within the large Microsoft organisation while working with a small budget. There is currently no leadership to steer the ship, and this may be the cause of the situation today. It's going to be interesting to see what happens in the future. I am in favor of continuing to break the functional app into pieces with contracts. I will explain how I would try to do this. My favorite example is Fixed Assets. Did you ever try to see what happens if you remove the 56xx objects from C/Side? Large parts of the application will no longer compile: codeunits like 12, 80 and 90, and tables like 37, 39 and 81. To prevent this you'd have to implement event publishers and introduce enumerations (a sketch follows at the end of this post). This would allow moving code that has dependencies into its own module. This needs to be done without changing any of the functionality and then taken into production. Only after a successful launch without changing the functionality can one consider changes. But the changes should then be made in a new app while leaving the old one intact. This is probably not something you would want to do with Fixed Assets, but with Production, Warehouse Management or Inventory it makes more sense. Especially Warehousing is in a horrible state, because it's hard to extend. It was never designed for extensibility. It does not have to be, when the old module can be replaced with a new module. Maybe I am just dreaming or oversimplifying things, but I think it's realistic to say that with the introduction of the system app Microsoft could have been more careful, shown more patience and allowed a more phased approach. After all, we are talking about a business solution that is critical to the companies using it. Microsoft made a strong promise about upgradability that can and should be kept. Partners have the responsibility to be more critical of their software vendor. In my opinion, a lot of unnecessary *** is taken for granted just because a logo with 4 squares has been stamped on it. Just my $0.02.
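Here is the promised sketch of that decoupling idea. All names and IDs are assumptions; it only shows the shape (an extensible enum plus an event publisher that a Fixed Assets module could subscribe to), not actual base-app code:

    // Extensible enum instead of a hard-coded option: a separate module can add values.
    enum 50100 "Posting Source Type"
    {
        Extensible = true;
        value(0; " ") { }
        value(1; "Fixed Asset") { }
    }

    // The core posting codeunit raises an event instead of calling 56xx objects directly.
    codeunit 50100 "Core Posting Events"
    {
        [IntegrationEvent(false, false)]
        procedure OnPostLine(SourceType: Enum "Posting Source Type"; var IsHandled: Boolean)
        begin
        end;
    }

    // The Fixed Assets module hooks in without the core app referencing it.
    codeunit 50101 "FA Posting Subscriber"
    {
        [EventSubscriber(ObjectType::Codeunit, Codeunit::"Core Posting Events", 'OnPostLine', '', false, false)]
        local procedure HandleFAPosting(SourceType: Enum "Posting Source Type"; var IsHandled: Boolean)
        begin
            if SourceType = SourceType::"Fixed Asset" then
                IsHandled := true; // the FA-specific posting logic would run here
        end;
    }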

Forum Post: Getting error "Operation aborted" while sending a JSON file by PUT HttpRequest

Hi all, I am sending a JSON file through an HTTP request with the PUT method in NAV 2017, but I am getting the error "Operation aborted" at XMLHTTP.send(Msg);. Below are my HttpRequest headers:

    CLEAR(XMLHTTP);
    CREATE(XMLHTTP,FALSE,TRUE);
    XMLHTTP.open('PUT',URL,FALSE);
    XMLHTTP.setRequestHeader('Content-Type: ','application/json');
    XMLHTTP.setRequestHeader('Authorization: ',Store."Token Type"+' '+AccessToken);
    XMLHTTP.setRequestHeader('scope:',Store.Scope);
    XMLHTTP.setRequestHeader('client_secret:',RestaurentIntegrationSetup.client_secret);
    XMLHTTP.setRequestHeader('client_id: ',RestaurentIntegrationSetup.client_id);
    { Body }
    XMLHTTP.send(Msg);

Forum Post: How to display JPY Currency Factor as 106.55, not 0.00938

Hi, This is a question about exchange rate setup in a NAV installation for Japan (here my LCY = Japanese yen, JPY). Let's suppose the current USDJPY rate is 1 USD = 106.55 JPY. I can enter this into my USD exchange rate window in two ways: Currency Code = USD, Relational Currency Code = '' (LCY), Exchange Rate Amount = 1, Relational Exch. Rate Amount = 106.55 OR Exchange Rate Amount = 0.009385, Relational Exch. Rate Amount = 1. In both cases, the Currency Factor field on a sales quote shows as 0.009385 (which is correct: in my first example it is 1 / 106.55 and in my second example it is 0.009385 / 1). The Currency Factor field is an internal field used throughout NAV, and I understand it's not really for users to look at. In my situation, the Japanese users want to see the rate of 106.55 (on sales quote lines, printed documents, etc.). Is there a way to make the Currency Factor field be 106.55, or is a new development required with a new field? I think a new development is needed but wanted to check your collective opinion. Many thanks, Chris

Forum Post: RE: Changing the value of a Text Constant with extensions

You cannot change a Text Constant, but perhaps there is an event subscriber you can hook into? Try running the function with the event recorder enabled.
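If the recorder does surface a usable event, subscribing to it looks roughly like this in AL; the publisher codeunit, event name and parameter here are placeholders to be replaced with whatever the recorder shows:

    codeunit 50110 "My Subscriber"
    {
        // Bind to the event found with the event recorder.
        [EventSubscriber(ObjectType::Codeunit, Codeunit::"Sales-Post", 'OnBeforePostSalesDoc', '', false, false)]
        local procedure OnBeforePostSalesDoc(var SalesHeader: Record "Sales Header")
        begin
            // react here instead of changing the base text constant
        end;
    }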

Forum Post: RE: How to display JPY Currency Factor as 106.55, not 0.00938

Hi Chris, You are right; collective opinion will support you in this. The "Currency Factor" field always contains the exchange rate from the local currency (JPY) to the document currency (USD); there is no setup option to show it the other way round. I probably wouldn't extend the table with another field for this purpose; I would rather opt for a function to invert the value on the fly. But this depends on the situation, of course.
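Such a function could be a one-liner. A minimal sketch in AL, with the name and the rounding precision as assumptions:

    procedure GetDisplayRate(CurrencyFactor: Decimal): Decimal
    begin
        // "Currency Factor" holds FCY per LCY (0.009385); users expect LCY per FCY (106.55).
        if CurrencyFactor = 0 then
            exit(0);
        exit(Round(1 / CurrencyFactor, 0.01));
    end;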

Forum Post: What is the best way to break into NAV?

I am an experienced developer: professional-level SQL Server and QlikView, and very experienced with VB. I want to break into NAV. There is so much to it; how did others first get into it? I have found basic projects, so I now have 6 months of experience. How did others do it?

Blog Post: The “SystemId” in Microsoft Dynamics 365 Business Central 2019 release Wave 2

I'm returning after a very interesting workshop on DevOps in Berlin. At this moment, I'm wasting my time in the very "attractive" TXL airport because of my delayed flight. And how can I better waste my time than by figuring out some stuff regarding Business Central? Figuring out indeed, because I have barely any internet, a crappy seat, nearly no access to food, ... so for me this is a matter of burying myself so I don't have to pay attention to my surroundings ;-). Anyway .. it's not all that bad .. but a delayed flight is never nice. Anyway..... topic of today: the new SystemId! While I was converting our app to be able to publish on the Wave 2 release, this was something that I noticed: all the "integration IDs" are marked for removal, and, most interestingly, will be replaced by "the SystemId". What is that? Google, Microsoft Docs .. none of my conventional resources helped me find out what SystemId is .. luckily, I did come across some information on Yammer by Microsoft ;-). RecordIDs You probably all know RecordIDs, right? A single "value" that refers to a specific record in a specific table. We all used them in generic scenarios, right? Also Microsoft did: I don't know if you know "Record Links"? A system table that stores notes and links to specific records? Well, the link to the record is made through a RecordID. We have been using it for years. Now, a big downside of using RecordIDs was the fact that when you renamed a record (changed one of the fields of its key), its RecordID would change as well .. and all of a sudden, you could lose the connection in all tables where you stored that specific ID. Long story short: not ideal for integration or generic scenarios. Surrogate Keys And this is where the "surrogate keys" of my good friend Soren Klemmensen came into place. He came up with a design pattern (well, I don't know if he came up with it, but he sure advocated it for a long time) that described how to implement a dedicated unique one-field key for a record. Basically: add a field to a table, and make sure it gets a unique GUID. Make it so that all these surrogate keys have the same field number, and you are able to generically access the value of the key for any record. This is something Microsoft actually implemented themselves, and the code is all over the place. Even still in Wave 2, we have the code to fill the "integration IDs", as they call them. A nice system, but a lot of plumbing is needed to make it work. I don't know if there was a design pattern that described what you needed to do to apply this to your own tables; I never did ;-). Definitely interesting for many scenarios, but .. quite a lot of work. The SystemId Now, as you got from the first screenshot: Microsoft is abandoning this field 8000 (the so-called "integration ID"), their first implementation of surrogate keys, and will implement "SystemId" in the platform. Meaning: whatever you do, you will ALWAYS have a key called "SystemId" for your table, which is a unique GUID in that table that can identify your record, and it will never change, even when you rename your record. How cool is that! Here is an example of a totally useless table I created to show you that I have the SystemId in IntelliSense: What can we expect from the SystemId?
Well, in my understanding, and quite literally what I got from Microsoft (thanks, Nikola):
- It exists on every record, but not on virtual/system tables (not yet, at least).
- You can even set it in rare scenarios where you want to keep the same value (e.g. copying from one table to another, upgrades, ...). Simply assign the SystemId on the record and do Insert(true, true) (2x true).
- There is a new keyword, GetBySystemId, to fetch a record by its SystemId.
- It is unique per table, not per database. Customers and items may have the same IDs, though that is hard to achieve if you are not manipulating it yourself, since GUIDs are unique. Let's say they are "probably" unique, but on SQL there is a unique key defined on the field, so uniqueness is only guaranteed per table.
- Integration Record is still there; however, the ID of the Integration Record matches the SystemId of the main record (Microsoft has code and upgrade logic in place).
- You can only have simple APIs on it (no nesting, like lines), at this point at least. It should be fixed soon, which is why the APIs are not refactored yet to use SystemId instead of Id.
A few more remarks. If you create a field that refers to a SystemId, it makes sense to use the DataClassification "SystemMetadata" for it. Not just because I say so, but because I noticed Microsoft does ;-). Another not unimportant thing I noticed: this is a system-generated field, so if you need the field number, you have recref.SystemIdNo. My take on it: from what I understood, there is work to do, but things are looking good :-). In fact, it is exactly what we have been asking for, and Microsoft delivers. Again! Great! I know this will see a lot of use in the (near) future, within the Base Application and in lots of apps. Do know that I didn't have any documentation about this, so all of it is based on a small remark on Yammer and things I could see in code. So, if you have anything to add, please don't hold back ;-). That's why I have a comment section ;-).
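Putting those pieces together, a minimal sketch (the table and the message are my own choices, not from the post) exercising the SystemId features listed above:

    procedure DemoSystemId()
    var
        Customer: Record Customer;
        SavedId: Guid;
    begin
        Customer.FindFirst();
        SavedId := Customer.SystemId; // always present; survives renames

        // The new keyword mentioned above:
        if Customer.GetBySystemId(SavedId) then
            Message('Found %1 by SystemId', Customer."No.");

        // Rare scenario (copy/upgrade): insert with a chosen SystemId.
        // Customer.SystemId := SavedId;
        // Customer.Insert(true, true); // the second TRUE inserts with the given SystemId
    end;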

Forum Post: RE: What is the best way to break into NAV?

Exactly what do you mean by "break into NAV"? Learning NAV?

Forum Post: RE: Changing the value of a Text Constant with extensions

Thanks for your answer. There isn't any event. I've created an event request on GitHub; let's see if the problem can be fixed...

Blog Post: Creating an Azure SQL Database backup via Powershell

I have several Azure SQL databases (mainly Microsoft Dynamics NAV databases) on different Azure subscriptions, and I often need to download a backup for some of them. Yesterday I decided to automate this process by using Powershell and the Azure REST APIs (in particular the Database - Export API). I want a Powershell script that connects to an Azure SQL Database, creates a backup (.bacpac) and saves it to an Azure Storage account (Blob container). I have to admit that I thought this task would be quite easy, but instead I spent a few hours on it. The tricky part is that you first need to create an Azure AD application, and then you need to use the context of this application to call the Azure REST API. Most Azure services (such as Azure Resource Manager providers and the classic deployment model) require your client code to authenticate with valid credentials before you can call the service's API. Authentication is coordinated between the various actors by Azure AD, which provides your client with an access token as proof of the authentication. The token is then sent to the Azure service in the HTTP Authorization header of subsequent REST API requests. For creating an Azure AD application from Powershell, you need to select an app name (it must be unique in your Azure AD) and provide a URI (it can be a fantasy URI) and a password for creating the application. The command to execute is the following:

    $appName = "YourApp"
    $uri = "http://yourapp"
    $secret = ConvertTo-SecureString "YourAppPassword" -AsPlainText -Force
    $azureADApplication = New-AzureRmADApplication -DisplayName $appName -HomePage $Uri -IdentifierUris $Uri -Password $secret

Then you need to create an Azure Service Principal (an identity created for use with applications) and assign the Contributor role to it. The Powershell commands are as follows:

    $svcprincipal = New-AzureRmADServicePrincipal -ApplicationId $azureAdApplication.ApplicationId
    $roleassignment = New-AzureRmRoleAssignment -RoleDefinitionName Contributor -ServicePrincipalName $azureAdApplication.ApplicationId.Guid

If the Azure AD application is successfully created, you need to retrieve the Tenant ID and the Application ID:

    Write-Output "Tenant ID:" (Get-AzureRmContext).Tenant.TenantId
    Write-Output "Application ID:" $azureAdApplication.ApplicationId.Guid

You will use these values in the next steps.
To create the Azure SQL backup process, you need to define some variables, like the name of the Resource Group of your Azure SQL database, the server name, the database name, and the login and password for accessing your database:

    $resourceGroup = "YourDatabaseResourceGroup"
    $server = "YourServer"
    $database = "YourDatabase"
    $sqlAdminLogin = "AdminUsername"
    $sqlPassword = "AdminPassword"
    $bacpacFilename = $database + (Get-Date).ToString("yyyy-MM-dd-HH-mm") + ".bacpac"

From your Azure Storage account, you need to retrieve the URI and the Access Key (available in the Azure Portal by selecting your storage account and clicking on the Access Keys panel):

    $baseStorageUri = "https://YourStorageAccountName.blob.core.windows.net/YourBlobContainerName/"
    $storageUri = $baseStorageUri + $bacpacFilename
    $storageKey = "YourStorageAccountKey"

Now you need to use the Tenant ID (here $tenantId), Application ID (here $applicationId) and secret key (here $secretkey) of the Azure AD application previously created, for authenticating and acquiring an authentication token:

    $authUrl = "https://login.windows.net/${tenantId}"
    $authContext = [Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContext]$authUrl
    $cred = New-Object Microsoft.IdentityModel.Clients.ActiveDirectory.ClientCredential $applicationId,$secretkey
    $authresult = $authContext.AcquireToken("https://management.core.windows.net/",$cred)

Now you need to fill in the request header for the Azure API by passing the authentication token, and then send a REST request with a JSON body as described in the Database Export API reference:

    $authHeader = @{
        'Content-Type'='application/json'
        'Authorization'=$authresult.CreateAuthorizationHeader()
    }
    $body = @{storageKeyType = 'StorageAccessKey';
        storageKey=$storageKey;
        storageUri=$storageUri;
        administratorLogin=$sqlAdminLogin;
        administratorLoginPassword=$sqlPassword;
        authenticationType='SQL'
    } | ConvertTo-Json

Then you can send the POST HTTP request to the API endpoint:

    $apiURI = "https://management.azure.com/subscriptions/$subscriptionId/resourceGroups/$resourceGroup/providers/Microsoft.Sql/servers/$server/databases/$database/export?api-version=2014-04-01"
    $result = Invoke-RestMethod -Uri $apiURI -Method POST -Headers $authHeader -Body $body
    Write-Output $result

If all is OK, the REST API is called and the backup of your database is executed. If you go to your Azure Blob Storage account, you can see (after a few minutes; it does not appear immediately) that the .bacpac file is there, ready for you. P.S. Remember to set a firewall rule on your Azure SQL database so that it is accessible from your local client IP. In case you need it one day, the complete Powershell code of these two scripts is available here.

Forum Post: Issue with setVisible in NAV BC cloud

Hi everyone, I'm trying to change the role center in the cloud version of BC. I need to hide some actions from the navigation menu. I've created this code: I've also tried to change the visibility to TRUE with the "payments" group, but nothing... Am I doing something wrong? Thank you

Forum Post: Reimplementation from NAV 2009 R2 classic to Dynamics 365 Business Central: questions

Hi Folks - One of our customers, who is on the NAV 2009 R2 classic client, intends to do a re-implementation in Dynamics 365 Business Central. Their NAV 2009 R2 database is heavily customized, with even NAV base code being modified or commented out to meet their business requirements. I am aware that in 365 BC on-premises you can still modify NAV base code, but MS will take that away in the coming months, and future upgrades will be a huge issue. Having said that, my questions are: 1. With the NAV base code heavily modified in the customer database, can they get around standard NAV base code in Business Central to get the same functionality they currently have in NAV 2009, or do they have to change their business processes? 2. Can extensions be used with such a heavily customized database? If not, what are the other alternatives? 3. Will data migration be an issue from NAV 2009 to 365 BC? Any help is highly appreciated. Thanks, S.

Blog Post: Ways You Are Sabotaging Your ERP Implementation. Part 1

Setting Unrealistic Live Dates

Overview

Typically, there are a lot of moving parts in an implementation. A lot of things have to go right for a successful implementation, and any key task that goes wrong, no matter how small, will cause havoc or delays for a company trying to go live. I can assure you that every Dynamics 365 Business Central (aka Dynamics NAV) consultant/company will tell you that they're an expert at Dynamics 365 Business Central / Dynamics NAV. Whether that's true or not, and how to detect if they're full of hot air, is probably a subject for another article. Even if you have the most qualified and reliable NAV partner, projects may still go wrong because of the decisions made by the company that's implementing the software.

I Want It Now

It's unfortunate (or fortunate) that we live in a society where instant gratification is the norm. You want something? Order it on Amazon in the morning and get it delivered in the afternoon. This type of service puts a lot of burden on the supplier to make sure everything goes right. When the owners of a company are under pressure to make the changes necessary to meet the demands (realistic or not) of their customers, they often want to see the same turnaround time (realistic or not) from the projects they initiate.

Setting Unrealistic Live Dates

For some managers, the idea seems to be to set a high expectation for the project: even if the project doesn't reach it, at least we'll end up further along than if expectations had been modest. One such decision is deciding on a live date. In an effort to make people place a sense of urgency on the implementation, management will set an unrealistic (or optimistic) go-live date. The employees or consultants will often be too polite, shy, or scared to call out this decision. As I mentioned earlier in this article, going live requires a lot of precise tasks to be completed. We can hurry those tasks or skip them, but shortcuts will often come back to haunt you. This is especially true in an ERP software implementation. What always ends up happening is one of the following:
1. The company goes live without being ready.
2. The live date gets postponed.
Of the two scenarios, if #1 happens, the implementation will always lead to failure. From my personal experience, customers that went live before they were ready never really recover; we end up having to re-implement them to get the company back on track. Hopefully, the management has the courage to decide on option #2 and call a stop and re-evaluate. In either case, the damage will already have been done.

Culture of Expecting Failure

Usually, when a company misses its first go-live date, it will miss its subsequent go-live dates as well. Why? Because the people are already used to failure. Their consultants or their management have promised them that they're going to go live, and you can almost hear the employees say "Yeah... Right...". When I walk into companies that have missed their go-live date a couple of times, it's almost like walking into a vacuum of demotivation. When you even talk about going live again, you're just met with an overload of cynicism and doubt.

Prevention

Set realistic live dates and stick to them. It's your responsibility to call out BS if someone tries to talk crazy about a live date. How we typically plan a customer go-live is to pick a date the customer would like to go live, then work backwards to see if that's feasible, considering holidays, vacations, buffer time, etc. If it's not feasible, we tell them right away, even if we get scolded. If you're in charge of the implementation, be ready to say no to unrealistic requests to go live or to hit certain milestones, even in front of a team of management. Yes, they will question your expertise, your resources, your abilities, even your character. Just remember: hurting one person's feelings is better than having the whole company suffer.

Blog Post: What if…

Sometimes I can be a bit emotional when it comes to changes in the software product I work with on a daily basis. An example of that was my previous blog, which I took offline in order to do some editing, making it less about emotion and more about facts. The emotion is probably justified for a few reasons, most of all the fact that Navision, NAV, Business Central (I stopped caring about the name) provides a living for me and my family. But it's more than that. Around our product there is a community that stands out from almost any other community I've seen. We have a large number of events that are not organised by Microsoft but by partners, by customers, or in one case even by one single guy. Not because they make money doing it, but because they think it's necessary. What I've seen is that even as the product changes and grows in complexity, the people around it don't stop loving it. Sometimes Microsoft makes a decision around the product that could have been made differently, but then we always have the community steering them back. It has happened so many times that I've lost count. When I read the comments on my blog, it looks as if a majority agrees that working in C/Side is faster than working in Visual Studio Code and that we lost a great deal of simplicity. I know that there are also many who disagree with this statement. One community that I've experienced to be even more passionate about their product is the Great Plains community. When I stated that, in my opinion, Business Central is replacing the GP product, I did not get love and hugs. The fact is, though, that I'm a bit jealous of the Great Plains people, and sometimes I wish that GP, instead of NAV, had been the platform of choice to move to the cloud. Why, you might think? Well, because Microsoft is still maintaining GP, adding new features and keeping it compatible with modern versions of Windows, but the software is not being overhauled like NAV. Imagine that for the next decade or more we could work with C/Side and sell new licenses, including being able to use the Windows client. Many would love that. The fact that Business Central is based on NAV makes it easy for me to join the new community, but it has cost us a great level of productivity. It's going to be very interesting to see what Microsoft will do after Wave II. I cannot wait to go to Vienna and see the roadmap. Personally, I think we can all use some slowing down after all the changes in the last few years. If the Microsoft slide for Wave III said "stabilise the product", I would stand up and cheer. From a business value perspective, integration scenarios are the most important area for the future. While Visual Studio Code and AL get all the attention, I would spend my time learning the API, and if I had a vertical solution I would redevelop it on another platform than AL. The future of Business Central is international. BC is the only flexible SMB solution with localizations and translations all over the globe.

Blog Post: Asynchronous programming in D365 Business Central 2019 wave 2: Page Background Tasks

In a lot of other programming languages, the concepts of asynchronous calls and background tasks are well established and supported. The idea is to have the ability to kick off something in the background without having to wait for the result and thereby blocking the user. Instead, the user can just... Read the full text.
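The feed cuts the post off, but the feature it introduces is the Page Background Task. A minimal sketch of the pattern under assumed object names and IDs (the Item inventory example is mine, not from the post):

    pageextension 50120 "Item List PBT" extends "Item List"
    {
        trigger OnAfterGetCurrRecord()
        var
            TaskParameters: Dictionary of [Text, Text];
        begin
            TaskParameters.Add('ItemNo', Rec."No.");
            // Enqueue the task; the UI stays responsive while it runs.
            CurrPage.EnqueueBackgroundTask(InventoryTaskId, Codeunit::"Inventory PBT Worker", TaskParameters);
        end;

        trigger OnPageBackgroundTaskCompleted(TaskId: Integer; Results: Dictionary of [Text, Text])
        begin
            // Called back on the page session when the background session finishes.
            if TaskId = InventoryTaskId then
                Message('Inventory: %1', Results.Get('Inventory'));
        end;

        var
            InventoryTaskId: Integer;
    }

    codeunit 50121 "Inventory PBT Worker"
    {
        trigger OnRun()
        var
            Item: Record Item;
            Parameters: Dictionary of [Text, Text];
            Results: Dictionary of [Text, Text];
        begin
            // Runs in a child session: read parameters, compute, hand back results.
            Parameters := Page.GetBackgroundParameters();
            Item.Get(Parameters.Get('ItemNo'));
            Item.CalcFields(Inventory);
            Results.Add('Inventory', Format(Item.Inventory));
            Page.SetBackgroundTaskResult(Results);
        end;
    }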

Forum Post: SETRANGE not applicable on value red|blue

Hi all, I want to add a restriction on posting of the item journal per location. For that I have done the following implementation: in the User Setup table (91) I created field 200, "Loc. Area Code", and when running the table I add values to this field such as red|blue|green. On the item journal, in the OnAction of Post, I added this code:

    UserSetup.RESET;
    UserSetup.SETRANGE("Loc area",Rec."Location Code");
    IF UserSetup.FIND('-') THEN BEGIN
      CODEUNIT.RUN(CODEUNIT::"Item Jnl.-Post",Rec);
      CurrentJnlBatchName := GETRANGEMAX("Journal Batch Name");
      CurrPage.UPDATE(FALSE);
    END ELSE
      MESSAGE('USER NOT VALID FOR THIS LOCATION: %1',"Location Code");

Now it is not allowing a user with location code blue or red to post on the item journal; it is giving me the error message. Why so?
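For what it's worth, a hedged note on the likely mismatch: SETRANGE compares the field against the literal stored value, and a record holding 'red|blue|green' never equals 'blue'. A membership test on the user's own setup record (field and variable names assumed from the post) would look more like:

    // Assumes the current user's setup record holds a pipe-separated list in "Loc area".
    UserSetup.GET(USERID);
    IF STRPOS('|' + UserSetup."Loc area" + '|', '|' + Rec."Location Code" + '|') <> 0 THEN BEGIN
      CODEUNIT.RUN(CODEUNIT::"Item Jnl.-Post",Rec);
      CurrentJnlBatchName := GETRANGEMAX("Journal Batch Name");
      CurrPage.UPDATE(FALSE);
    END ELSE
      MESSAGE('USER NOT VALID FOR THIS LOCATION: %1',"Location Code");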

Forum Post: RE: How to display JPY Currency Factor as 106.55, not 0.00938

Thank you Alex for your answer - I agree with you