Hi All, I'm not able to download symbols in AL code; it is showing the error below. I'm using D365BC Wave 2 on a local machine. Below are the launch file and the error. Note: the D365BC web client is working fine.

{
  "version": "0.2.0",
  "configurations": [
    {
      "type": "al",
      "request": "launch",
      "name": "Your own server",
      "port": 7049,
      "server": "http://localhost",
      "serverInstance": "BC150",
      "authentication": "Windows",
      "startupObjectId": 22,
      "startupObjectType": "Page",
      "breakOnError": true,
      "launchBrowser": true,
      "enableLongRunningSqlStatements": true,
      "enableSqlInformationDebugger": true
    }
  ]
}

[2019-11-16 20:53:17.27] Error: Could not connect to the server. Please verify that:
1. The server is started.
2. The development endpoint is enabled (DeveloperServicesEnabled server setting).
3. The service name matches the one specified in the URL.
4. The port number matches the one specified in the URL (DeveloperServicesPort server setting).
5. The protocol matches the one specified in the URL (DeveloperServicesSSLEnabled server setting), and that a trusted certificate is used in case of an SSL connection.
↧
Forum Post: Unable to connect to D365BC server from VS Code editor
↧
Forum Post: RE: Unable to connect to D365BC server from VS Code editor
Solved by enabling the Developer Services endpoint: 1. Open the Business Central Administration console. 2. Select your server instance. 3. Go to the Development tab. 4. Enable "Enable Developer Service Endpoint" (and restart the instance for the change to take effect).
↧
Blog Post: #BCALHelp
It’s the week of NAVTechDays, the biggest Business Central community event of the year, and I thought it would be good to spend a few moments on the state of our community.

Business Central is taking off. According to Microsoft there are more than 4000 paying tenants, and the average number of users per tenant is 10+, which is the sweet spot where Navision used to be strong. I can also see it in my daily work, especially at ForNAV, where almost all of the support cases and pre-sales activities now involve onboarding new Business Central partners. Especially from North America the uptake from Great Plains partners is great, and it’s nice to see the enthusiasm.

I’ve said it before and I’ll say it again: Business Central is by far the best ERP in the cloud and without doubt the most flexible and easiest to enhance. If you just forget for a moment that it used to be Navision, and forget about the failed marketing, it’s so easy to fall in love with the product once again. Microsoft is much quieter, or less noisy, working hard on improving the product and realising the vast number of changes necessary to dot the i’s and cross the t’s.

The people who are new to the product need help getting started, and Twitter is a nice medium to shout out for help. For that reason Microsoft started the hashtag #BCALHelp and encourages the community to subscribe to the hashtag and help their peers.

I think being the underdog, quietly working on being the best without bragging, has been a position that was always good for Navision, and it’s good for Business Central. Our customers are small entrepreneurs working hard, and not as susceptible to loud marketing as the big Fortune 500. Let’s be humble and be the best in our game. Take it slow and steady and make the best with what we have.

I’m sure that with that, NAVTechDays will soon be BCTechDays, with the same pride and dignity as before, serving the same great communities of Navision and Great Plains combined. See you all in Antwerp!
↧
Forum Post: RE: Select Variant on Sales Order
Did you check the "hidden fields" for sales lines?
↧
Forum Post: Getting Incorrect time in date-time field of Nav16 by SQL Query
Dear All, when I am getting data by a SQL query from the Nav16 database, I am getting an incorrect time in a date-time field; sample attached. Please help to solve the issue. Thanks in advance,
↧
Forum Post: RE: Getting Incorrect time in date-time field of Nav16 by SQL Query
This is likely due to the fact that NAV saves times in UTC in SQL and converts them according to your regional settings. https://docs.microsoft.com/en-us/dynamics-nav/datetime-data-type
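A small illustration of that behaviour, just as a sketch: TimeTest is a hypothetical record variable for a made-up table with a DateTime field "Created At", not anything from the thread.

TimeTest.INIT;
TimeTest."Created At" := CURRENTDATETIME;   // client shows e.g. 14:30 local time (UTC+1 regional setting)
TimeTest.INSERT;
MESSAGE('%1', TimeTest."Created At");       // again displayed as 14:30 in the NAV client
// A direct query in SSMS, e.g. SELECT [Created At] FROM [CRONUS$Time Test],
// returns 13:30, because SQL Server stores the UTC value; this shifted value is the
// "incorrect" time you see when bypassing the NAV client.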
↧
Forum Post: RE: Transferfields command using with multiple linked tables
Hi RedFoxUA, I have now resolved all the issues except this one. In the codeunit, this code is inserted in the OnRun trigger:

//Transfers a record of Job (always only 1 record). This is the header table
JobToRec.CHANGECOMPANY(CompanyTo);
JobToRec.TRANSFERFIELDS(SourceJob);
JobToRec.INSERT(TRUE);

//Finds all records of the table linked to the header table. SourceJobTaskF is the linked table
SourceJobTaskF.SETRANGE("Job No.",SourceJob."No."); //SourceJob is the header table
IF SourceJobTaskF.FINDSET THEN
  REPEAT
    TargetJobTaskF.INIT;
    TargetJobTaskF.CHANGECOMPANY(CompanyTo);
    TargetJobTaskF.TRANSFERFIELDS(SourceJobTaskF,FALSE);
    TargetJobTaskF.INSERT;
  UNTIL SourceJobTaskF.NEXT = 0;

This means that SourceJob is the header table with only 1 record of the job (project), and in the lines of the project (job) there are 3 records (entries). The code sets the range to those 3 records as expected, but when all 3 records are transferred I get an error:

Microsoft Dynamics NAV
---------------------------
The Job Task already exists. Identification fields and values: Job No.='',Job Task No.=''
---------------------------
OK
---------------------------

There was 1 blank record in the line table, and I deleted it, but the error still pops up. Can you advise me on what else to check? Thank you BR Damjan
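For reference, a minimal sketch of the same copy loop (same variable names as the post, but not the exact thread code). One detail worth checking: TRANSFERFIELDS with FALSE as the second parameter does not copy the primary key fields, so after INIT the target keys stay blank, which is one way to end up inserting the Job No.='', Job Task No.='' record the error message complains about. Letting TRANSFERFIELDS copy the key fields (the default) avoids that:

JobToRec.CHANGECOMPANY(CompanyTo);
JobToRec.TRANSFERFIELDS(SourceJob);
JobToRec.INSERT(TRUE);

SourceJobTaskF.SETRANGE("Job No.",SourceJob."No.");
IF SourceJobTaskF.FINDSET THEN
  REPEAT
    TargetJobTaskF.INIT;
    TargetJobTaskF.CHANGECOMPANY(CompanyTo);
    TargetJobTaskF.TRANSFERFIELDS(SourceJobTaskF); // default: copies the key fields (Job No., Job Task No.) as well
    TargetJobTaskF.INSERT;
  UNTIL SourceJobTaskF.NEXT = 0;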
↧
Forum Post: RE: Transferfields command using with multiple linked tables
Hi, just to update: I re-ran the test and it works as expected. Thank you
↧
Blog Post: WARNING – The Data Upgrade Elephant
Last Thursday I was at the QBShare event in Veghel, Holland. I’ve been attending these events ever since I joined ForNAV a few years back, and since the audience is a bit different from my normal events (CEOs vs. developers) it took me a while to get to know people. No matter who you talk to at these events, all that they have on their minds is moving their IP to Business Central, and most are in the middle of that project, some with my help.

This is all great, and it’s cool to see people’s growing enthusiasm, but at this event I raised a question that for me is very obvious, though to my surprise it was not for others. The reason for my question was this slide in the presentation from William van Voorthuijsen. Once partners are done putting their IP on AppSource, the next step is onboarding customers. Naturally you will start with customers who are as current as possible. These customers will want to migrate data, including transactions, to Business Central, and in order to do that they first have to migrate their OnPrem NAV to BC15 with a more-or-less matching schema. WHAAT???

I hope that after following my blog nobody believes in the no-more-upgrade fairytales, but this is a new chapter in this marketing bubble. Right now, a customer on NAV2015, for example, must first pay the partner for an upgrade OnPrem, and every 6 months they wait the upgrade will get more expensive. Remember that at the point of writing this blog, there is no direct upgrade path like we had in the old days, where a customer could upgrade from Navision 2.01 to Navision 4SP3 without any issues. A friend of mine who does all of my upgrades even has his own tools that handle one-step data migration from Navision 2.x to NAV2018 or Business Central Wave I. The upgrade from Wave I to Wave II is tedious at best since C/SIDE is retired. Now everything has to be done in PowerShell, and Microsoft has even started promoting doing stuff directly on the SQL database in order to speed up the process.

My recommendation to QBS, but also to Microsoft, is to somehow make this process more affordable and to guarantee that even if a customer wants to upgrade 5 years from now, there is still a mechanism that allows doing that. I expect that in the next years the SQL schema of Business Central will undergo many changes, which is expensive for the technology that Microsoft uses for the upgrade.

The good news is that Microsoft employs the smartest people in the world and they seem to have a subscription to my blog. Let’s see where this ends. I’ve already heard some rumours that this problem has the attention of the teams, and I hope this article helps leadership prioritise this issue. There is an elephant in the room so big that nobody seems to see it.
↧
Forum Post: RE: How to add open balance to trial balance report in nav 2018
I appreciate your answer. My client finds it difficult to change. They believe in the old system and they don't want to adapt to change. But I have done it by designing a new report, using the start-balance definition in the Detail Trial Balance report as my guide. Thanks
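For anyone building a similar report, here is a minimal sketch of how an opening balance is typically derived from G/L entries, in the spirit of the start-balance logic in the Detail Trial Balance report (GLEntry, GLAccount, StartDate and OpeningBalance are illustrative variable names, not the exact report code):

// Opening balance of one G/L account = sum of all entries posted before the period starts.
GLEntry.RESET;
GLEntry.SETRANGE("G/L Account No.", GLAccount."No.");
GLEntry.SETFILTER("Posting Date", '..%1', CLOSINGDATE(StartDate - 1));
GLEntry.CALCSUMS(Amount);
OpeningBalance := GLEntry.Amount;
// The same figure can also be read from the "Balance at Date" FlowField on the G/L Account,
// with the "Date Filter" set to ..CLOSINGDATE(StartDate - 1).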
↧
Forum Post: RE: How to add open balance to trial balance report in nav 2018
Hi Ayinde01, if I understand correctly, you just need to explain to your customer how the standard report works and then add a certain column to the report if needed. And explain the meaning of the dates - order date, document date, posting date. I had a lot of questions about why something is not displayed on this date. In the end, we found out that the customer wanted to see some values on the order date, while the report was showing data by posting date, and those 2 dates were different. So the customer said "I do not see the balance on this date". After I explained to them which date is taken into consideration, the customer accepted the user mistake and understood that the posting date gives them the correct date of the posted entry.

I usually start at the order creation and set today's date; then I release the sales order, reopen it, make some changes and set the date in the future by +1 day, and release it again. Then, before I post the sales order, I change the posting date to +3 days, and then we look at the customer ledger entries and BA entries. Then we do some cases on posting of opening balances, and usually, as we migrate from the old system to NAV, we post the opening balance on a test database where I explain each date field on a case. Just keep it simple, using a case on a test/UAT database.

I try to convince them to use the standard report for opening balances, unless some additional data has to be displayed (usually a new column has to be inserted on the report to cover local financial needs). Thank you BR Damjan
↧
Blog Post: Dynamics 365 Business Central: handling telemetry with Azure Application Insights
Handling telemetry is one of the most important activities that should always be done on a Dynamics 365 Business Central SaaS tenant. The classical way to inspect the telemetry of a tenant is to go to the Admin portal of your tenant, access the Telemetry tab and inspect the data there (by applying filters and so on).

Starting from Dynamics 365 Business Central version 15, you can set up your environment to send telemetry data to the Azure Application Insights service. I’ve talked about this service in my Azure Functions webcast some weeks ago. Azure Application Insights permits you to monitor your Azure resources and detect anomalies. It includes powerful analytics tools to help you diagnose issues and to understand what users actually do with your cloud applications. This feature is currently in preview.

To start sending the telemetry data of your Dynamics 365 Business Central tenant to Azure Application Insights, you first need to create an instance of this service in your Azure subscription. Just go to the Azure Portal and create a new Application Insights instance as in the following picture. When the Application Insights instance is provisioned, you have an Instrumentation Key (top right corner in the Overview pane). Take note of this key.

To enable your Dynamics 365 Business Central tenant to send telemetry data to Application Insights, you need to go to the tenant Admin portal, select your environment, click on the Application Insights Key button, and then in the opened window click on Enable Application Insights and insert the relative Instrumentation Key. When you click on Save, the environment is restarted (do that during non-working hours), and after that your tenant is ready to send telemetry data to your Azure Application Insights instance.

Currently, Dynamics 365 Business Central sends only telemetry data related to long running SQL queries, that is, any SQL query that takes longer than 1000 milliseconds to execute (as I’ve said before, the feature is in a preview state). More data will be added in the future.

From the Application Insights Overview dashboard you can start analyzing your telemetry data (graphically or by querying your data). Here, as an example, I’ve clicked on the Server response time section on my dashboard in order to see the details of the queries with a long response time (sent from the telemetry). From these details, you can also select the View in Logs (Analytics) section and then enter a query editor to analyze your telemetry data more in depth.

This is the really powerful feature of Azure Application Insights: you can use the Kusto query language to query all your telemetry data as needed. For example, here I want to analyze all my long running queries logged in the last 3 days. By selecting a record from the result set, you can inspect the details of the selected long running query (client type, object that caused the long running query, code executed and so on). Extremely powerful, isn’t it?

Now the common question: is the Azure Application Insights telemetry integration available only for the SaaS environment? No. You can use Application Insights also for your on-premises installations.
For the on-premises world, you can inject the Application Insights Instrumentation Key into your NST instance by using the following PowerShell cmdlet (replace the angle-bracket placeholders with your own values):

Set-NAVServerConfiguration -KeyName ApplicationInsightsInstrumentationKey -KeyValue <instrumentation key> -ApplyTo All -ServerInstance <server instance>

If you have a multi-tenant environment:

Mount-NAVTenant -AadTenantId <AAD tenant id> -DatabaseName <database name> -Id <tenant id> -DisplayName <display name> -ApplicationInsightsKey <instrumentation key>

Happy telemetry inspecting!
↧
Forum Post: RE: Transferfields command using with multiple linked tables
It is good that your code works and it is OK. P.S. One small remark: I can see you use different calls (JobToRec.TRANSFERFIELDS(SourceJob) and TargetJobTaskF.TRANSFERFIELDS(SourceJobTaskF,FALSE)), and not the same record (table) as you wrote before.
↧
Forum Post: RE: Changing the posting date while reversing the payment entries
If I were you, I would take Codeunit 17 "Gen. Jnl.-Post Reverse", function "Reverse", as a starting point.
↧
Forum Post: Error while viewing Statistics of Sales Order in BC on Cloud
Hi All, I am getting the below error while viewing SO statistics. When I checked the table, there is no line with number 230001. Can someone tell me how to check/resolve this issue? Thanks in advance
↧
Forum Post: Kitchen display monitor setup for NAV2013R2
Hi, I am using NAV2013R2 with LS and currently using KDS printers. We want to replace the KDS printers with a KDS monitor. Please guide. Thanks.
↧
Forum Post: RE: Kitchen display monitor setup for NAV2013R2
This forum is more for "standard" NAV/BC, so you are not likely to get a positive response to a question like this, as LS is an add-on. You should really ask LS on their support site. Thanks
↧
Forum Post: RE: Kitchen display monitor setup for NAV2013R2
Hi Nayans, check this link from LS Retail: https://help.lscentral.lsretail.com/Content/LS%20Hospitality/Kitchen%20Display%20System/KDS%20Setup.htm It's written for the latest LS Central version, but it should also be helpful for any older version. Thanks Damjan
↧
Forum Post: How to get the last value in a table in dynamics nav
Commits.RESET;
IF Commits.FINDLAST THEN
  ImpDetails.RESET;
ImpDetails.SETRANGE(ImpDetails."No.","No.");
IF ImpDetails.FINDSET THEN BEGIN
  REPEAT
    Commits.INIT;
    Commits."Entry No" := Commits."Entry No" + 1;
    Commits."Document No" := ImpDetails."No.";
    Commits."Beneficiary Name" := Payee;
    ImpDetails.TESTFIELD("Global Dimension 3 Code");
    ImpDetails.TESTFIELD("Budget Name (Accrued Budget)");
    Commits."Department Code" := ImpDetails."Global Dimension 3 Code";
    IF ImpDetails."Request Amount" <> 0 THEN
      Commits.Amount := ImpDetails."Request Amount";
    //Start: Check for the account no
    ImpDetails.TESTFIELD("Account No");
    Commits."Account No" := ImpDetails."Account No";
    Commits.Expenses := ImpDetails.Expense;
    //Start: Check Budget Dimension 2 Code = Property Code is not equal to empty
    IF (ImpDetails."New Type" = ImpDetails."New Type"::"Capex Codes") THEN BEGIN
      ImpDetails.TESTFIELD("Budget Dimension 2 Code");
    END;
    Commits."Property Code" := ImpDetails."Budget Dimension 2 Code"; //Name:IB Date:07/07/2019
    Commits."Budget Name" := Rec."Budget Name (Accrued Budget)";
    //End: Check
    // compute budget balance
    commitReg2.RESET;
    commitReg2.SETFILTER("Account No",ImpDetails."Account No");
    commitReg2.SETFILTER("Department Code",ImpDetails."Global Dimension 3 Code");
    commitReg2.SETFILTER("Budget Name (Accrued Budget)",ImpDetails."Budget Name (Accrued Budget)");
    commitReg2.SETRANGE("Property Code",ImpDetails."Budget Dimension 2 Code");
    //IF commitReg2.COUNT = 0 THEN BEGIN
    IF commitReg2.ISEMPTY THEN BEGIN
      GLBudget.RESET;
      GLBudget.SETRANGE("G/L Account No.",ImpDetails."Account No");
      GLBudget.SETRANGE("Budget Dimension 1 Code",ImpDetails."Global Dimension 3 Code"); //Name: IB, Date: 07/07/2019
      GLBudget.SETRANGE("Budget Dimension 2 Code",ImpDetails."Budget Dimension 2 Code"); //Name: IB, Date: 07/07/2019
      GLBudget.SETRANGE("Budget Name",ImpDetails."Budget Name (Accrued Budget)"); //Name: IB, Date: 07/22/2019
      GLBudget.CALCSUMS(Amount);
      Commits."Budget Balance" := GLBudget.Amount - ImpDetails."Request Amount";
    END ELSE BEGIN
      IF commitReg2.FINDLAST THEN
        Commits."Budget Balance" := commitReg2."Budget Balance" - ImpDetails."Request Amount";
    END;
    IF "Advance Type" = "Advance Type"::"Touring Advance" THEN
      Commits.Purpose := COPYSTR("Purpose/ Destination",1,100)
    ELSE IF "Advance Type" = "Advance Type"::"Cash/Purchase Advance" THEN
      Commits.Purpose := COPYSTR("Purpose of Request",1,100)
    ELSE IF "Advance Type" = "Advance Type"::Imprest THEN
      Commits.Purpose := COPYSTR("Purpose of Request1",1,100)
    ELSE IF "Advance Type" = "Advance Type"::"Payment Request" THEN
      Commits.Purpose := COPYSTR("Purpose of Request2",1,100)
    ELSE IF "Advance Type" = "Advance Type"::"Staff Claim" THEN
      Commits.Purpose := COPYSTR("Purpose of Request3",1,100);
    //End
    Commits.INSERT(TRUE);
  UNTIL ImpDetails.NEXT = 0;
END;
MESSAGE('Successfully commited to the register');

Hi Team, the above code is meant to pick up the last value in the table and do some calculation with it, but it does not: there is a value in the table, yet the code skips it. Thanks
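For comparison, here is a minimal sketch of the usual "next entry number" pattern (NextEntryNo is a hypothetical Integer variable, and only a couple of the field assignments are shown). Two things to check in the code above: the IF Commits.FINDLAST THEN only guards ImpDetails.RESET because there is no BEGIN..END, and the running number depends on whatever FINDLAST left in the Commits record. Reading the last record once before the loop and keeping a counter avoids both issues:

NextEntryNo := 1;
Commits.RESET;
IF Commits.FINDLAST THEN
  NextEntryNo := Commits."Entry No" + 1;   // continue after the last existing entry

ImpDetails.RESET;
ImpDetails.SETRANGE(ImpDetails."No.","No.");
IF ImpDetails.FINDSET THEN
  REPEAT
    Commits.INIT;
    Commits."Entry No" := NextEntryNo;
    Commits."Document No" := ImpDetails."No.";
    Commits.INSERT(TRUE);
    NextEntryNo := NextEntryNo + 1;
  UNTIL ImpDetails.NEXT = 0;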
↧
Forum Post: RE: Transferfields command using with multiple linked tables
Hi, it is the same; I renamed it to a more meaningful name. Thanks
↧