Hold on! That's a totally different message, with the payload encoded in a different way - it's not actually JSON if you are sending it as a URL-encoded string. So what we are talking about here is not a problem of BC sending incorrect data in the request - it's rather about agreeing on the message format between BC and the server. I mentioned Laravel above because I can see from the error log that the server receiving the messages runs a PHP application based on this framework. One thing Laravel does is read the Content-Type header: if it contains '/json', the message body is decoded with json_decode; otherwise, the content is passed on as is, without decoding. The message is then handed by the controller dispatcher to a particular implementation of the controller, which is declared as an abstract class in the framework. This is your ContactController, which is supposed to create contacts from the message content. Laravel and its HTTP base layer Symfony are common open-source frameworks, but I have no idea what is happening inside the custom controller extending the framework. The solution to this problem should be to clarify the exact message format the server is supposed to receive, or at least to compare the message (including all the headers) sent from NAV 2018 and mimic it in the new extension - RequestBin could help here.
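For illustration, a minimal AL sketch (the endpoint URL and payload fields are placeholders, not the real contract) of sending the body as actual JSON with a Content-Type that makes Laravel run it through json_decode:

procedure SendContactJson()
var
    Client: HttpClient;
    Content: HttpContent;
    Headers: HttpHeaders;
    Response: HttpResponseMessage;
    Body: Text;
begin
    // Placeholder payload - the real field names must match what ContactController expects
    Body := '{"name":"John Doe","email":"john@example.com"}';
    Content.WriteFrom(Body);

    // Replace the default text/plain content type with application/json,
    // so the Laravel side decodes the body with json_decode
    Content.GetHeaders(Headers);
    Headers.Remove('Content-Type');
    Headers.Add('Content-Type', 'application/json');

    // Placeholder URL - point this at the real endpoint (or a RequestBin while testing)
    Client.Post('https://example.com/api/contacts', Content, Response);
end;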
↧
Forum Post: RE: Extension for send a jSon to a webService
Hi Alexander, Thank you as always. We've done different tests, and finally the service provider made a small change to the web service, and the communication is working. Now I have another problem when that service calls my published web service, but that's another matter, probably for another post... Thank you again, I really appreciate your time and tips!!!
↧
Forum Post: Consume a webService published in BC cloud sandbox
Hi everyone, As you can deduce from my latest post, I'm working on developments to integrate BC with external services. I am creating a function in a codeunit that should return a text variable to an external service that calls it. I'm testing it in Postman, but I'm not able to connect to it: As you can see, I'm getting a 500 internal error. The user and password I'm using to authenticate are the ones I use to log into BC. The codeunit is published as a web service, as I've always done: Maybe this is a permission issue or something like that?
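For reference, a minimal sketch of the kind of codeunit being exposed here (the object ID, names and returned text are placeholders); the SOAP URL to call from Postman is the one shown on the Web Services page, and with Basic authentication against a cloud sandbox the password is normally the user's Web Service Access Key, not the interactive login password:

codeunit 50100 "Return Text Service"
{
    // Placeholder function exposed through a codeunit web service;
    // the SOAP operation name matches the procedure name
    procedure GetGreeting(Name: Text): Text
    begin
        exit(StrSubstNo('Hello, %1', Name));
    end;
}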
↧
Forum Post: The server did not provide a meaningful reply; this might be caused by a contract mismatch, a premature session shutdown or an internal server error
Hi All, one of my users is hitting this error. Sometimes in the morning they can access the system, but after that they hit this error. I've checked the Event Viewer and there is nothing. Any advice? Thanks
↧
Forum Post: RE: Consume a webService published in BC cloud sandbox
I'm quite sure this must be an authentication issue. The user in BC is this one: Which credentials should I use? Or maybe it's this checkbox, which I'm not able to set to true...
↧
Blog Post: Dynamics 365 Business Central: TRANSFERFIELDS and Obsolete fields
Do you know the wonderful C/AL (oops, now AL) command called TRANSFERFIELDS? This command permits you to copy all matching fields from one record to another record:

Record.TRANSFERFIELDS(FromRecord [, InitPrimaryKeyFields])

TRANSFERFIELDS copies fields based on the Field No. property of the fields. For each field in Record (the destination), the contents of the field that has the same Field No. in FromRecord (the source) will be copied, if such a field exists. The fields must have the same data type for the copying to succeed (text and code are convertible; other types are not), and there must be room in the destination field for the actual length of the contents being copied. If any one of these conditions is not fulfilled, a run-time error occurs.

TRANSFERFIELDS is widely used in Microsoft's Base App code (posting routines and so on), but unfortunately at the moment there's a problem with this command on Dynamics 365 Business Central: it ignores the ObsoleteState property. As an example, imagine having a SOURCE table with the following fields:

Field ID | Field Name | Field Type | ObsoleteState
1        | Field1     | Code[20]   |
2        | Field2     | Text[100]  |
3        | Field3     | Integer    | Removed
4        | Field4     | Decimal    |

Here, Field3 was declared with ObsoleteState = Removed (this field will never be used). Now, consider a DESTINATION table with the following fields:

Field ID | Field Name | Field Type | ObsoleteState
1        | Field1     | Code[20]   |
2        | Field2     | Text[100]  |
3        | Field3     | Code[20]   |
4        | Field4     | Decimal    |

If you now execute DESTINATION.TransferFields(SOURCE) in your AL code, you receive a runtime error (like "the following fields must have the same type"), because TRANSFERFIELDS also tries to transfer Field3 from the SOURCE to the DESTINATION table (despite the ObsoleteState property) and the data types don't match. There's also an issue opened on GitHub a long time ago about this, but no news from Microsoft at the moment.

How to avoid this? Quite difficult (read: impossible) in Microsoft's Base App code (you cannot modify that code). For your own solutions (extensions), you should implement a "safe TRANSFERFIELDS" command that also considers the ObsoleteState field property. Obviously, Microsoft should do the same in its standard codebase. Here is a possible solution (Microsoft, please check/think about this) of a "safe" TRANSFERFIELDS that:
- Transfers only fields where ObsoleteState is not set to Removed.
- Checks that the data types match between source and destination fields (so no errors are thrown).

procedure SafeTransferFields(SourceTableID: Integer; TargetTableID: Integer)
var
    SourceRef: RecordRef;
    TargetRef: RecordRef;
    FldRef: FieldRef;
    FieldsSource: Record Field;
    FieldsTarget: Record Field;
    FieldsNoToTransfer: Record Integer temporary;
begin
    FieldsSource.SetRange(TableNo, SourceTableID);
    FieldsSource.SetRange(Class, FieldsSource.Class::Normal);
    FieldsSource.SetRange(Enabled, true);
    FieldsSource.SetFilter(ObsoleteState, '<>%1', FieldsSource.ObsoleteState::Removed);
    if FieldsSource.FindSet() then
        repeat
            // Check if the field exists in the destination table and if the criteria for the transfer are satisfied
            if FieldsTarget.Get(TargetTableID, FieldsSource."No.") then
                if (FieldsTarget.Class = FieldsSource.Class) and
                   (FieldsTarget.Type = FieldsSource.Type) and
                   (FieldsTarget.ObsoleteState <> FieldsTarget.ObsoleteState::Removed)
                then begin
                    // This field must be transferred
                    FieldsNoToTransfer.Number := FieldsSource."No.";
                    FieldsNoToTransfer.Insert();
                end;
        until FieldsSource.Next() = 0;

    if FieldsNoToTransfer.IsEmpty() then
        exit; // There are no fields to transfer

    // Execute the transfer of the selected fields
    SourceRef.Open(SourceTableID);
    TargetRef.Open(TargetTableID);
    if SourceRef.FindSet() then
        repeat
            FieldsNoToTransfer.FindSet();
            repeat
                FldRef := TargetRef.Field(FieldsNoToTransfer.Number);
                FldRef.Value := SourceRef.Field(FieldsNoToTransfer.Number).Value;
            until FieldsNoToTransfer.Next() = 0;
            TargetRef.Insert();
        until SourceRef.Next() = 0;
end;

Basically, the procedure checks all the fields to transfer, saves their numbers in a temporary Integer table, and then performs the transfer of those fields by using RecordRef and FieldRef objects. You can use this "safe TRANSFERFIELDS" in your extensions to avoid errors. As said before, Microsoft should do something too…
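A minimal usage sketch, assuming two hypothetical custom tables:

// Hypothetical call: copy all compatible, non-removed fields from every record
// of the "My Source" table into new records of the "My Target" table
SafeTransferFields(Database::"My Source", Database::"My Target");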
↧
Blog Post: Business Central & Nintendo
WARNING!! Personal opinion here! Inspiration for writing down ideas is everywhere. Next week my youngest son has his birthday, and we went out this evening by train to the big city of Deventer to buy him (us) a Nintendo Switch. I've been loyal to Nintendo since the 1980s and have bought many of their consoles. I'm also loyal to Mario and have most of the games. When the Nintendo Wii came out there was the option of buying NES games for 5 dollars, and I bought Mario I and III. Again, because I had already purchased them 20 or so years earlier.

Now what does this have to do with Business Central? It made me reflect on a talk I had a few hours earlier while driving back home in the car with some former colleagues. They wanted to pick my brain about upgrading customers from old Navision to Business Central. In this case one was on 3.70 and the other on 2009 Classic. With Navision, upgrading was easy. It required common sense and discipline, two qualities every Navision developer should have. Back in the day I did many upgrades at a fixed price for less than 5k. 60% of all Navision installations are on Classic, or that is what my former colleagues told me. They held off upgrading because of the gap with the RTC and something called a financial crisis that forced many people to sit on their (flat) wallets for half a decade or more.

They wanted to know how to analyse whether the customizations can be ported to Business Central. The idea was to install Business Central on-premises and keep running on the same version, again for a decade or more. I told them that this is a horrible idea and it should not be advised to customers. Nobody should want to run Business Central on-premises with a support window of only 6 months. That's right: Business Central only gets cumulative updates for 6 months. After that you are on your own, and if you want to do stuff like backporting a fix from a higher version, it means making your own base app. That's horrible.

Business Central on AL has effectively just become a cloud-only solution, because no SMB is equipped to install and maintain it on premises and upgrade every 6 months. Because the extension model in Business Central with AL is based on taking a dependency on metadata from Microsoft, it's very fragile to changes made by Microsoft. A lot of things that are technically possible should be avoided with per-tenant extensions, because you will be forced to refactor your changes every 6 months. Very, very expensive.

Business Central is a high-volume product that should be personalised with apps from AppSource. An AppSource app is only interesting from an economics perspective with one hundred paying customers or more, as it requires a dedicated team focusing on high quality, automated testing and an intuitive user interface. The partners who do not accept this and keep modifying Business Central with complex per-tenant extensions are a danger to our ecosystem. If you require a complex module for your company, use a different platform like the Power Platform or other Microsoft options. As long as you stick to Azure, MSFT does not care.

The days of easy upgrades with Navision are over. Welcome to the days of Nintendo, where we have to constantly buy new consoles to use the new toys.
↧
Forum Post: RE: Shortcut dimension will not export in excel.
Yes, because it's not stored in the GLE (G/L Entry) table. You need to write a query/report or customize to get it.
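For example, a sketch of such a query, assuming the shortcut dimension in question is a dimension code like 'AREA' (placeholder) linked through the Dimension Set ID:

query 50100 "GLE Shortcut Dimensions"
{
    QueryType = Normal;

    elements
    {
        dataitem(GLEntry; "G/L Entry")
        {
            column(EntryNo; "Entry No.") { }
            column(GLAccountNo; "G/L Account No.") { }
            column(PostingDate; "Posting Date") { }
            column(Amount; Amount) { }

            dataitem(DimSetEntry; "Dimension Set Entry")
            {
                DataItemLink = "Dimension Set ID" = GLEntry."Dimension Set ID";
                SqlJoinType = LeftOuterJoin;
                // Placeholder: filter to whichever dimension is set up as the shortcut dimension
                DataItemTableFilter = "Dimension Code" = const('AREA');

                column(DimensionCode; "Dimension Code") { }
                column(DimensionValueCode; "Dimension Value Code") { }
            }
        }
    }
}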
↧
Forum Post: Ship from multiple sales orders, only invoice one freight line
Currently we may have multiple sales orders for a customer which may all ship on the same day; each order has freight lines on it. We want some way to combine the shipments and charge only one freight line. Is there a solution in NAV?
↧
Forum Post: RE: Your extension is incompatible with an upcoming release of Microsoft Dynamics 365 Business Central
You should select the next major version while uploading your new extension, so that Microsoft will install it automatically when they upgrade the tenant to BC Wave 2.
↧
Forum Post: RE: Your extension is incompatible with an upcoming release of Microsoft Dynamics 365 Business Central
Thanks Mohana. That's what I've done: use the next major update and wait for the automatic upgrade.
↧
Forum Post: Unable to download symbol in BC150
Hi Everyone, I have installed the BC Wave 2 release and tried to download the symbols before publishing the application, but it throws the error message below. So my question is: did I miss something, or is the application version not correct? Can anyone let me know how I can fix this? Below I have added my app.json and launch.json files. Thank you.

Error
[2019-10-21 12:17:58.06] The request for path /BC150/dev/packages?publisher=Microsoft&appName=Application&versionText=14.0.0.0 failed with code NotFound. Reason: No published package matches the provided arguments.
[2019-10-21 12:17:58.06] Could not download reference symbols. Please ensure that:
1. The correct server name and instance are specified in the launch.json file.
2. The correct application version is specified in the app.json file.
3. The dependencies are correctly specified in the app.json file.
Microsoft (R) AL Compiler version 4.0.2.51497

App.json
{
  "id": "bfd0674d-8d01-4016-ac87-d82d6361a865",
  "name": "ALProject3",
  "publisher": "Default publisher",
  "version": "1.0.0.0",
  "brief": "",
  "description": "",
  "privacyStatement": "",
  "EULA": "",
  "help": "",
  "url": "",
  "logo": "",
  "dependencies": [],
  "screenshots": [],
  "platform": "14.0.0.0",
  "application": "14.0.0.0",
  "idRanges": [
    {
      "from": 50100,
      "to": 50149
    }
  ],
  "contextSensitiveHelpUrl": "https://ALProject3.com/help/",
  "showMyCode": true,
  "runtime": "3.2"
}

Launch.json
{
  "configurations": [
    {
      "type": "al",
      "request": "launch",
      "name": "Your own server",
      "server": "http://localhost",
      "serverInstance": "BC150",
      "authentication": "Windows",
      "startupObjectId": 22,
      "startupObjectType": "Page",
      "breakOnError": true,
      "launchBrowser": true,
      "enableLongRunningSqlStatements": true,
      "enableSqlInformationDebugger": true,
      "port": 7049
    }
  ]
}
↧
Forum Post: RE: Unable to download symbol in BC150
I have changed "runtime" to "4.0" but I still get the same error.
↧
Forum Post: RE: Unable to download symbol in BC150
Try this:

{
  "id": "xxxxxx-5a3f-4706-a58e-xxxxxx",
  "name": "STD",
  "publisher": "Developer",
  "version": "1.0.0.1",
  "brief": "",
  "description": "",
  "privacyStatement": "",
  "EULA": "",
  "help": "",
  "url": "",
  "logo": "",
  "dependencies": [
    {
      "appId": "xxxxxxxxxxxx-4f03-4f2b-a480-xxxxxxxxx",
      "publisher": "Microsoft",
      "name": "System Application",
      "version": "1.0.0.0"
    },
    {
      "appId": "xxxxxxxxxxx-84ff-417a-965d-xxxxxxxxx",
      "publisher": "Microsoft",
      "name": "Base Application",
      "version": "15.0.0.0"
    }
  ],
  "screenshots": [],
  "platform": "15.0.0.0",
  "idRanges": [
    {
      "from": 50000,
      "to": 50049
    }
  ],
  "contextSensitiveHelpUrl": "https://STD.com/help/",
  "showMyCode": true,
  "runtime": "4.0"
}
↧
Forum Post: Getting error while selecting Query in BC onprem Web Services
Hi All, I have created query 50000 - Sales Inv Hdr using table Sales Invoice Header. When trying to select it on the web services page, it gives an error. I tried with a different query ID but got the same error. I created another query, Sales Inv Line - 50001, using table Sales Invoice Line, and I am able to select that one. What am I missing, and where? Thanks in advance.
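For comparison, a minimal sketch of a query over Sales Invoice Header that should be selectable on the Web Services page (the object ID, name and chosen columns are placeholders):

query 50000 "Sales Inv Hdr"
{
    QueryType = Normal;

    elements
    {
        dataitem(SalesInvoiceHeader; "Sales Invoice Header")
        {
            column(No; "No.") { }
            column(SellToCustomerNo; "Sell-to Customer No.") { }
            column(PostingDate; "Posting Date") { }
            column(AmountIncludingVAT; "Amount Including VAT") { }
        }
    }
}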
↧
Forum Post: Can we create Custom Microsoft flows in Business central?
Hi All, Version: Business Central Wave 2. I am creating a custom workflow using Microsoft Flow, but I get the error shared in the attachment. Can we create custom triggers & flows in Business Central?
↧
Forum Post: Dynamics Nav 365 integration with Dynamics 365 Sales
Hello, I'm working on an integration between Dynamics NAV 365 and Dynamics 365 for Sales. I'm trying to connect using the Microsoft-provided link ("https://*.crm4.dynamics.com"), and whenever I test or enable the connection I get the following error: ( Microsoft Dynamics 365 Business Central The connection setup cannot be validated. Verify the settings and try again. Detailed error description: The system is unable to connect to Microsoft Dynamics CRM. Detailed description: Unable to Login to Dynamics CRM. ) Any help would be appreciated! Thanks in advance.
↧
Forum Post: RE: Table Relation with multiple filters
Thanks again Alexander, I swear I tried that originally. Do you know of any Business Central tables that are used for the purpose of storing filter data like that? Or should I just create my own? Business Central's "pay per table" model sure forces me to think differently when it comes to data storage. Thanks, Yann
↧
Forum Post: Publish a CodeUnit as REST
Hi everyone, A simple question: is it possible to publish a codeunit as a REST API in NAV 2018? Or only as SOAP? Thank you!
↧
Forum Post: Update Factbox when changing record via Next or Previous Button
Hi, I've created a JavaScript control add-in for my Customer Card page which loads customer-related data. The add-in is subscribed to OnAfterGetRecord, which works great when you open the customer card. Unfortunately, when I hit the Next/Previous buttons on the page, the data in the FactBox remains linked to the original customer. If I close the page and open the next customer, the data loads properly. I've also tested on the Customer List page, and when I change customer the data does not update there either. If I debug my code, the OnAfterGetRecord fires the first time the add-in is loaded but will not fire again when I click to view the next customer. Any thoughts? Thanks Yann
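For reference, a minimal page-extension sketch (the control add-in "Customer Data Addin", its LoadCustomerData procedure and the object IDs are placeholders for whatever the real add-in exposes) that pushes the current record to the add-in from OnAfterGetCurrRecord, which also fires when navigating with Next/Previous:

pageextension 50120 "Customer Card Addin Ext" extends "Customer Card"
{
    layout
    {
        addlast(content)
        {
            group(CustomerDataGroup)
            {
                Caption = 'Customer Data';

                // Placeholder usercontrol; assumes the controladdin object and its JavaScript already exist
                usercontrol(CustomerDataAddin; "Customer Data Addin")
                {
                    ApplicationArea = All;

                    trigger ControlReady()
                    begin
                        // Remember that the add-in is ready before pushing data to it
                        AddinReady := true;
                        CurrPage.CustomerDataAddin.LoadCustomerData(Rec."No.");
                    end;
                }
            }
        }
    }

    trigger OnAfterGetCurrRecord()
    begin
        // Fires when the page moves to another record (including Next/Previous),
        // so the add-in is refreshed for the customer currently shown
        if AddinReady then
            CurrPage.CustomerDataAddin.LoadCustomerData(Rec."No.");
    end;

    var
        AddinReady: Boolean;
}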
↧