Invicta Stores

About Invicta company

Invicta Stores is an official reseller of the Invicta Watch Company of America. Invicta Stores operates InvictaStores.com and numerous physical retail locations across the United States, including the territory of Puerto Rico. To manage its back-office operations, Invicta Stores uses Microsoft Dynamics NAV to track inventory levels, manage purchase orders, and automate many aspects of its supply chain.

In collaboration with Tenging, I had the opportunity to design and develop solutions that optimize Invicta Stores' business processes.

Invincible Guarantee

Invicta Stores has created an innovative program called the Invincible Guarantee. This program allows customers to trade in their Invicta watch for credit towards the purchase of a new watch. This program has been extremely popular with customers, as it allows them to upgrade their timepiece while also enjoying significant cost savings.

The Invincible Guarantee is managed through a dedicated web portal, which is integrated with Dynamics NAV via the BC365 Connector. Customers can use the portal to initiate a trade-in, and the system automatically generates a quote based on the condition of the watch. If the customer accepts the quote, they can apply the credit towards the purchase of a new watch.

Official website of the program: https://invincibleguarantee.com

Invicta Loyalty

To provide customers with an enhanced shopping experience at Invicta Stores, I developed a mobile application that allows customers to access their loyalty accounts, collect more points, and take advantage of exclusive offers and coupons.

The mobile application is designed to be user-friendly, with an intuitive interface that allows customers to navigate easily and quickly. With the app, customers can view their loyalty points balance, track their rewards, and redeem them for exclusive offers and coupons.

The frontend part of the application was developed using the Quasar Framework. The backend part of the application was developed using the .NET Core NAV middleware.

My role in the Invincible Guarantee and Invicta Loyalty projects

  • Azure VM/IIS installation and web application setup.
  • Network setup, DNS record management, and SSL certificates.
  • Solution architecture design.
  • C/AL codeunit development and deployment.
  • API design and implementation.
  • Design and development of the .NET Core NAV middleware.
  • CSS/HTML template layout from Figma designs.
  • Frontend development using the Quasar Framework.
  • Git source code management and solution deployment.
  • Customer support, feature implementation, and versioning.

Using Quasar Framework with Dynamics NAV/Business Central

The Quasar Framework is a powerful open-source cross-platform framework based on Vue.js. Vue.js is trusted by numerous high-profile companies such as Alibaba, Adobe, Grammarly, GitLab, Netflix, and Xiaomi. The primary objective of this framework is to enable developers to “write code once and deploy it simultaneously as a website, a mobile application, and/or a desktop application.”

This framework offers various flavors for creating responsive web applications such as:

  1. A SPA (Single Page Application/Website)
  2. A SSR (Server-side Rendered App/Website)
  3. A PWA (Progressive Web App)
  4. A BEX (Browser Extension)
  5. A Mobile App (through Cordova or Capacitor)
  6. An Electron App

Quasar Framework proves to be a useful tool for extending the functionality of Microsoft Dynamics NAV/BC. It excels at building user-friendly interfaces for interacting with NAV, making it ideal for various applications such as e-commerce solutions, self-service kiosks, POS terminals, inventory control systems, and much more. By leveraging the Quasar Framework, developers can build custom solutions that enhance NAV’s capabilities while providing an intuitive and seamless user experience.

The figure above illustrates a sample architecture for integrating Quasar Framework with Dynamics NAV/Business Central. The middle tier in this architecture can be either Node.js BC365 middleware or .NET Core NAV middleware.

Examples of integrating the Quasar Framework with Dynamics NAV/BC

Tokyo Sushi e-commerce solution

Web-based point of sale 

Appointment scheduling tool

Node.js BC365 middleware

Connecting Node.js with Dynamics 365 Business Central

In this article, I will guide you on how to transfer data between Dynamics 365 Business Central and Node.js in JSON format. The approach to creating a Node.js BC365 middleware is similar to the one in our previous article on creating a .NET Core NAV middleware. In that article, we described how to publish a codeunit with an entry point function “GetJSON” using SOAP and OData V4.

This article will focus on the following topics:

  1. How to transfer JSON data between Node.js and BC365.
  2. How to consume a published codeunit and build an API to manage BC365 data using Node.js.
  3. How to leverage AWS Cloud and serverless architecture to host a Node.js application.
  4. How to use Bitbucket and Pipelines to automate publishing and development processes.

Creating the GetJSON entry point function

In Dynamics 365 Business Central, it is feasible to expose a function as a web service action using OData V4. This represents a more advanced and modern approach to interacting with Dynamics 365 Business Central, surpassing the SOAP web services method.

  1. Using Visual Studio Code, we can create a codeunit with a GetJSON function. This function takes method and request as input parameters and returns text in JSON format. GetJSON is the central entry point, and depending on the method, it distributes data to the internal functions of the codeunit.
  2. As an example, we can create a Ping function that simply sends to the output whatever it receives as an input parameter.
  3. The codeunit is published as an OData V4 service.

Using the node-fetch library to retrieve data from the GetJSON function

  1. Once we’ve published the Codeunit via OData V4, we can access it using the URL https://BaseUrl/NAVMiddleTierName/ODataV4/CodeunitName_FunctionName. The JSON object we pass in the body should match the input parameters of the GetJSON procedure, which includes the method and request in text format.
  2. To ensure a successful POST call, we need to include special headers, such as the Authorization and Company name.
  3. We can use the node-fetch library to retrieve the JSON data from BC365 by making a call to the URL.
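As a concrete sketch of the steps above, the snippet below builds the POST request for the published GetJSON action and then calls it with fetch (node-fetch, or the global fetch available in Node 18+). The base URL, the service name MiddlewareAPI_GetJSON, and the credentials are illustrative placeholders, not names defined by BC365.

```javascript
// Placeholder environment values -- substitute your own middle tier and service name.
const BASE_URL = 'https://bc.example.com/NAVMiddleTier/ODataV4';

// Build the URL, headers, and body for the GetJSON unbound action.
// The JSON body mirrors the codeunit signature GetJSON(method, request).
function buildGetJsonRequest(user, password, company, method, request) {
  return {
    url: `${BASE_URL}/MiddlewareAPI_GetJSON?company=${encodeURIComponent(company)}`,
    options: {
      method: 'POST',
      headers: {
        // Basic authorization: base64-encoded "user:password"
        Authorization: 'Basic ' + Buffer.from(`${user}:${password}`).toString('base64'),
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({ method, request }),
    },
  };
}

// The actual call: OData V4 actions wrap the returned text in a "value"
// property, and since GetJSON returns a JSON string, we parse it once more.
async function callGetJson(user, password, company, method, request) {
  const { url, options } = buildGetJsonRequest(user, password, company, method, request);
  const res = await fetch(url, options);
  const data = await res.json();
  return JSON.parse(data.value);
}
```

Separating request construction from the network call keeps the header and body logic testable without a live BC365 instance.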

Building an API to manage BC365 data with Node.js

After successfully receiving BC365 data in Node.js, the next step is to build an API and expose it for implementation by other systems. One way to do this is to use the AWS cloud to host the Node.js application and a serverless model to manage it. By using serverless architecture, we can focus on writing code rather than managing infrastructure. AWS provides several services, such as AWS Lambda and API Gateway, that can be used together to create and deploy serverless APIs quickly and easily. With a serverless approach, we only pay for the resources our application consumes, and we don’t have to worry about scaling, availability, or security.

  1. We can use the express.js library to write a controller that manages incoming HTTPS requests and passes them to the BC365 codeunit function GetJSON. This library allows us to configure route names, request types, and error handling.
  2. After wrapping, the API looks friendlier to consumers, and requests can be called anonymously. We can even add caching, JWT authorization, and other features at this level.
  3. With the API wrapped, it's ready for frontend developers to use in building web and mobile applications. For example, they can use the API to get product information and display it on a web page.
  4. To host the Node.js application on AWS, we can use a serverless.yml file to describe the application's functions, resources, plugins, and other necessary configuration information. This prepares the application to be hosted on AWS Lambda using a serverless model.
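A minimal sketch of that controller layer, kept framework-agnostic so the handler can be tested without a running server: ROUTE_MAP, the route names, and the injected backend function are illustrative assumptions, and the commented express.js lines show how it would be mounted.

```javascript
// Map friendly consumer routes onto GetJSON method names (illustrative names).
const ROUTE_MAP = {
  '/api/items': 'GetItems',
  '/api/customers': 'GetCustomers',
  '/api/ping': 'Ping',
};

// Resolve an incoming path to a BC365 method, or null if unknown.
function resolveMethod(path) {
  return ROUTE_MAP[path] || null;
}

// Generic handler: forward the request body to the codeunit and translate
// failures into HTTP status codes. `backend` stands in for the function
// that actually calls the published GetJSON codeunit.
async function handleRequest(path, body, backend) {
  const method = resolveMethod(path);
  if (!method) return { status: 404, body: { error: 'Unknown route' } };
  try {
    const result = await backend(method, JSON.stringify(body));
    return { status: 200, body: result };
  } catch (err) {
    return { status: 502, body: { error: 'BC365 call failed' } };
  }
}

// With express.js, the wiring would look roughly like:
//   const express = require('express');
//   const app = express();
//   app.use(express.json());
//   app.post('/api/:route', async (req, res) => {
//     const r = await handleRequest(req.path, req.body, callGetJson);
//     res.status(r.status).json(r.body);
//   });
```

Injecting the backend function is what makes caching or JWT checks easy to layer in later at this level.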

Hosting the Node.js BC365 middleware as an AWS Lambda function

AWS Lambda is a serverless compute service that enables running code without the need to provision or manage servers. Serverless architecture allows for launching apps only when needed. When an event triggers the code to run, the public cloud provider dynamically allocates resources for the code. The user stops paying when the code finishes executing, leading to cost savings and greater efficiency. Additionally, serverless architecture frees developers from the routine and menial tasks associated with app scaling and server provisioning, allowing them to focus on writing code and building applications.

  1. The Lambda service comes with seven metrics for your functions out of the box: invocations, duration, error count, throttles, async delivery failures, iterator age, and concurrent executions.
  2. The function overview feature helps you see the triggers, layers, and destinations associated with your function. You can also view the AWS services or resources that invoke the function and the resources that contain libraries, custom runtimes, or other dependencies.
  3. CloudWatch Logs allows you to centralize the logs from all your systems, applications, and AWS services in a single, highly scalable service.
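Assuming an API Gateway proxy integration, the Lambda entry point for the middleware could be sketched as follows. The event handling is standard for proxy events, while the injected `backend` (standing in for the GetJSON caller described earlier) is stubbed so the sketch stays self-contained.

```javascript
// Lambda handler for API Gateway proxy events. The `backend` parameter is
// injected for testability and represents the function that calls the
// published BC365 GetJSON codeunit.
async function lambdaHandler(event, backend) {
  // Proxy events carry the HTTP body as a JSON string.
  let payload;
  try {
    payload = JSON.parse(event.body || '{}');
  } catch (e) {
    return { statusCode: 400, body: JSON.stringify({ error: 'Invalid JSON body' }) };
  }
  if (!payload.method) {
    return { statusCode: 400, body: JSON.stringify({ error: 'method is required' }) };
  }
  const result = await backend(payload.method, payload.request || '');
  return {
    statusCode: 200,
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(result),
  };
}

// The exported entry point referenced from serverless.yml (handler: index.handler):
// exports.handler = (event) => lambdaHandler(event, callGetJson);
```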

Using Bitbucket Pipelines to deploy the Node.js BC365 middleware

Bitbucket Pipelines is a comprehensive CI/CD service that comes integrated with Bitbucket. It allows developers to automate their build, testing, and deployment processes using a configuration file in their repository. This configuration file is called bitbucket-pipelines.yml and is usually located at the root of the repository.

With Bitbucket Pipelines, developers can define a pipeline that consists of multiple stages, each of which can have one or more jobs. These jobs can be used to build and test the code, generate artifacts, and even deploy the code to different environments.

By setting up a Bitbucket Pipeline, developers can streamline their development workflows, ensure the code quality, and rapidly deliver their applications to production.
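As an illustration of the configuration file described above, a bitbucket-pipelines.yml for this setup might look like the fragment below. The Node version, step names, and the use of the Serverless Framework CLI are assumptions; AWS credentials would be supplied as secured repository variables rather than committed to the file.

```yaml
# Illustrative bitbucket-pipelines.yml for the Node.js middleware.
image: node:18

pipelines:
  branches:
    main:
      - step:
          name: Build and test
          caches:
            - node
          script:
            - npm ci
            - npm test
      - step:
          name: Deploy to AWS Lambda
          deployment: production
          script:
            - npm ci
            # Assumes AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY are set
            # as secured repository variables.
            - npx serverless deploy --stage production
```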


.NET Core NAV middleware

Transfer data between .NET Core and NAV

In this article, I will discuss different methods to transfer data between .NET Core and NAV in JSON format. We will also explore how to consume a published NAV codeunit using .NET Core and build an external API to manage NAV data.

The implementation of .NET Core NAV middleware can vary based on the NAV version, authentication types, and data specifics. In this article, we will discuss three different ways of implementing the middleware.

.NET Core NAV middleware based on SOAP web services and WCF

This approach works for NAV 2009 through NAV 2018 when the codeunit is published as a SOAP web service.

  1. The core of our .NET Core NAV middleware is the Codeunit, which exports the external function GetJSON. This function takes Request and Param as input parameters and returns ReturnJSON in JSON format. GetJSON is the central entry point, and depending on the Request, it distributes data to the internal functions of the Codeunit.
  2. For instance, an internal function, GetItems, returns a list of products in JSON format using the Newtonsoft.Json library as a .NET add-on.
  3. To make our Codeunit available to .NET Core, we publish it as a SOAP Web service with a specific URL.
  4. By opening this URL in a browser, we can see the WSDL metadata of the published GetJSON function.
  5. To connect the published Codeunit, we use Visual Studio tools and CoreWCF, which is a port of the service-side of Windows Communication Foundation (WCF).
  6. The WCF library generates all the necessary proxy classes automatically.
  7. Using the generated proxy classes, we create a ServiceRepository class, which manages the connection channel.
  8. In the ServiceRepository class, we create a wrapper over the GetJSON function to establish a connection between the NAV middle tier and the .NET Core NAV middleware.

.NET Core NAV middleware based on SOAP web services and System.Net.Http.HttpClient

To establish a more convenient and lightweight connection with NAV in certain scenarios, one can avoid using WCF. By employing basic authorization and maintaining the GetJSON function signature, the System.Net.Http.HttpClient library can be utilized for sending simple POST or GET requests in XML format. This approach can streamline the process and offer a simpler solution for certain use cases.

  1. To communicate with a published SOAP service, we can directly provide an XML envelope that adheres to specific rules. The JSON data from the <return_value> XML tag can be easily extracted from the request response body.
  2. To pass authorization in the SOAP service, we need to provide the login and password encoded in a base64 string as the Authorization header. Additionally, we should include the codeunit name and function name as the SOAPAction header.
  3. In Postman, we can enter the login and password for Basic Authorization in the Authorization tab.
  4. Using the .NET System.Net.Http.HttpClient library, we can create a POST request, pass the necessary headers, and the XML envelope in the body to retrieve data from the SOAP service.
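To make the envelope and headers from the steps above concrete, here is an illustrative request. The endpoint, the service name Middleware, and the parameter values are placeholders, and 7047 is only the default NAV SOAP port; substitute the values of your own deployment.

```xml
<!--
  POST https://BaseUrl:7047/NAVMiddleTier/WS/CompanyName/Codeunit/Middleware
  Authorization: Basic <base64-encoded user:password>
  SOAPAction: urn:microsoft-dynamics-schemas/codeunit/Middleware:GetJSON
  Content-Type: text/xml; charset=utf-8
-->
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"
               xmlns:mw="urn:microsoft-dynamics-schemas/codeunit/Middleware">
  <soap:Body>
    <mw:GetJSON>
      <!-- Mirrors the codeunit signature GetJSON(Request, Param) -->
      <mw:request>GetItems</mw:request>
      <mw:param>{"filter":"A*"}</mw:param>
    </mw:GetJSON>
  </soap:Body>
</soap:Envelope>
```

The response carries the JSON payload inside the return-value tag of the SOAP body, which can be extracted with any XML parser or a simple string search.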

.NET Core NAV middleware based on OData V4 Unbound Action and System.Net.Http.HttpClient

From Microsoft Dynamics NAV 2018 and onwards, it is feasible to expose a function as a web service action using OData V4. This represents a more advanced and modern approach to interacting with NAV, surpassing the SOAP Web services method. OData V4 is widely adopted to connect to Business Central.

  1. In Visual Studio Code, we can create an entry point for the GetJSON function, utilizing the same approach mentioned earlier to distribute data between internal functions of the codeunit.
  2. As an example, we can create a Ping function that simply sends to the output whatever it receives as an input parameter.
  3. The codeunit is published as an OData V4 service.
  4. We can call the codeunit function via the URL /NAVMiddleTierName/ODataV4/CodeunitName_FunctionName. However, it's important to include basic Authorization and the Company in the request headers.
  5. By passing a JSON body to the POST request, we can receive a response from the Ping function.
  6. With the .NET System.Net.Http.HttpClient library, we can create a POST request and include the necessary headers and JSON data in the body.

Building an API to manage NAV data with .NET Core

With the three methods outlined above, we have successfully established a connection between .NET Core and the NAV codeunit. The next step is to create an API for frontend clients such as web and mobile applications, or other systems that require NAV integration.

To accomplish this, we can utilize the ASP.NET Core MVC framework to implement the API, user interfaces, data, and control logic.

  1. Here’s an example of an MVC controller that receives the Request parameter as a NAV method name, and a JSON string in the body. It passes this data to the GetJSON function.
  2. Frontend applications can leverage this controller by calling the GetItems method, which enables them to retrieve a list of items directly from NAV and render them on the screen.

Extending .NET Core NAV middleware

In this article, we have explored several methods of using .NET Core to extract data from NAV in JSON format and transmit it to client applications. However, the .NET Core NAV middleware is more than a mere data transmitter. It serves several additional purposes, such as:

  1. Load balancing for a large number of requests
  2. Request caching in internal memory
  3. Intermediate storage of files and pictures
  4. Authentication and data protection
  5. Sending email or SMS notifications
  6. Integration with payment systems
  7. Logging and monitoring tools

These functionalities can be implemented as plug-ins and added to the middleware, based on the customer’s specific requirements. By leveraging the .NET Core NAV middleware, developers can create custom solutions that not only provide seamless integration with NAV but also offer additional features and functionalities to meet the client’s specific business needs.

Tenging verslunarlausnir ehf.

About Tenging

Tenging is a leading consulting company that provides development and support to clients managing Microsoft Dynamics Business Central solutions for the retail and hospitality sectors in Europe and the USA. Its team of 40 skilled professionals works remotely from 8 different countries, and the company's headquarters is in Vilnius, Lithuania.

As an early employee at Tenging, I played a pivotal role in laying the foundation for the company’s growth. With my expertise in architectural solutions, I helped streamline operations and increase the company’s revenue by 20 times in just 5 years. This growth also led to a significant increase in the number of employees, expanding the team from just 2 to 40 talented individuals spread across the globe.

Working at Tenging has been an incredibly rewarding experience. Collaborating with a team of experts from diverse backgrounds has helped me expand my skillset and broaden my perspectives. I take pride in being part of an organization that is dedicated to providing top-notch services and delivering value to its customers.

Azure infrastructure

In my first project, I was responsible for developing and deploying the complete company infrastructure on Microsoft Azure. My primary duties and accomplishments during this project included:

  • Deploying virtual machines for the application and database layers.
  • Installing and configuring the Active Directory Domain Controller.
  • Setting up organizational security policies, rights, and permissions.
  • Installing and configuring MSSQL servers, transferring databases, and setting up backup policies.
  • Setting up IIS servers, administering domain names, and issuing SSL certificates.
  • Installing and managing NAV middle-tier versions from 2009 to Business Central.
  • Setting up and administering virtual networks.
  • Configuring security policies for network interfaces.
  • Configuring backup policies for virtual machines and databases.

Project management tool

My next challenge at Tenging was to develop a project management tool that would help streamline the daily work of our employees. I created a powerful tool that allows colleagues to easily create and assign tasks, add comments, attach files, and track their work hours.

The project management tool leverages NAV2013 as its backend and incorporates its project management business logic. When users log their hours, the record is automatically saved in the Time Sheet Line table in NAV, allowing for seamless financial optimization using the NAV Financial Management module.

To enhance the functionality of the tool, we integrated it with popular software such as Teams, Outlook, SharePoint, and Azure AD. This allowed for better collaboration among team members and streamlined workflows.

The task management tool was built using .NET Core NAV middleware and the Quasar Framework, ensuring a robust and scalable solution that meets the needs of our growing business.

Mobile application for task management tool

For my next project, I was tasked with converting the Task Management System into a mobile application using the Quasar Framework. This framework enabled me to utilize the same codebase for building web, mobile, and desktop applications. Throughout this project, I acquired valuable experience in using Xcode and Android Studio to develop and debug mobile applications. Additionally, I became adept at submitting applications for review and successfully navigating the review process for both Google Play and App Store.

Unified API Monitoring Tool

Tenging has implemented a Unified API Monitoring Tool that consolidates all the client solutions developed and published by the company. This powerful tool enables our support center to quickly identify and resolve communication issues in real-time, ensuring seamless and uninterrupted services for clients.

With the Unified API Monitoring Tool, our team can monitor all client systems and applications from a single dashboard. The tool offers a comprehensive range of features such as real-time alerts, performance tracking, and detailed analytics.

Data MATRIX

About Data MATRIX

Data MATRIX is a renowned contract research organization that has successfully conducted hundreds of clinical trials in Eastern Europe. With a team of 130 experienced professionals, the company provides phase I-IV studies and therapeutic expertise in areas such as oncology, infectious disease, and all major therapeutic indications.

In 2014, I joined Data MATRIX as a developer after working for a year at r_keeper. The transition to this new role exposed me to a completely new set of business processes and opened up a whole new world of clinical trials.

Electronic data capture 1.0

My first project at Data MATRIX was to understand and make changes to the existing EDC 1.0. Understanding the architecture of the application was one of the most challenging tasks in my career. The first level of EDC is a two-dimensional matrix that includes patients and events. Next, there is a submatrix of patients and event forms. Each form is made up of groups, and each group consists of sets of form inputs. Additionally, events, forms, and groups can dynamically change their number. Each form has a dynamic set of validation rules, and there are many other complex features to consider.

MATRIX Cloud

The next project I worked on at Data MATRIX was the MATRIX Cloud, which was a highly complex all-in-one SaaS solution for clinical trials. It was a multi-module, cloud-based solution that aimed to streamline and simplify the clinical trial process.

Electronic case report form designer (CRF Designer)

My main contribution to the MATRIX Cloud project was the development of the CRF-designer module. This application allows CRF (Case Report Form) designers to create and manage the forms used to collect data in a clinical trial in a user-friendly and efficient way. The CRF Designer provides an intuitive interface for designing forms with customizable input fields, validation rules, and branching logic. It also includes features for version control, collaboration, and multi-language support. With the CRF Designer, clinical trials can be designed and launched more quickly and with greater accuracy, reducing the risk of errors and saving time and resources.

Electronic data capture 2.0

One of the modules I developed for MATRIX Cloud was the new EDC 2.0, which was built based on the forms created in the CRF Designer. In other words, the CRF Designer allowed users to design forms, and EDC 2.0 enabled them to collect and store data using these forms. This allowed for a more streamlined and efficient data collection process for clinical trials.

After two years of development, MATRIX Cloud was successfully released into production for conducting clinical trials.

Electronic Patient Reported Outcomes (ePRO)

After working on the MATRIX Cloud project for 3 years, I was promoted to lead a small team of 4 people. Our team was tasked with developing an ePRO mobile application, which would allow patients to conveniently participate in clinical trials using their mobile devices.

The mobile application was successfully launched in the study Decrease or Loss of Sexual Desire (NCT03463707).

From 2014 to 2017, DataMATRIX underwent rapid growth with the release of a new product. During this time, the company moved into three progressively better offices and increased its staff from 30 to 130 professionals. My role in the company also expanded, allowing me to develop new skills in architectural design and people management. However, in 2018, I left DataMATRIX after nearly four years of working there. It was then that my friend invited me to join him at his startup, Tenging.

R_keeper automation system

About r_keeper

R_keeper is a leading software developer in hospitality and entertainment enterprise automation. Today, over 65,000 restaurants in 53 countries around the world use r_keeper for their business automation needs.

I joined r_keeper in 2013 after working for JSC Concern Sozvezdie. At that time, the engineers at r_keeper were actively transitioning all their products from desktop versions to web and cloud-based solutions. During my time there, I gained a deep understanding of the business logic that underpins the restaurant industry.

Manager Station

I worked on a significant project at r_keeper called the Manager Station. This software enables restaurant owners to set up point-of-sale (POS) systems, manage staff access to checkout operations, and generate reports on the restaurant's key performance indicators.

Store House

The second project I worked on at r_keeper was called Store House. This module provided comprehensive automation of production management, enabling efficient cost management, streamlined procurement processes, and enhanced staff control. By using this module, businesses could track inventory levels, analyze usage patterns, and optimize supply chain processes, reducing waste and maximizing efficiency. Overall, Store House helped businesses to operate more smoothly and effectively, and was an essential component of r_keeper's suite of enterprise automation solutions.

During my time at r_keeper, I was able to expand my knowledge in web development, both on the backend and frontend sides. In the backend, I gained experience working with technologies such as C#, ASP.NET MVC, SOAP, XML, Fast Report, and MSSQL. On the frontend side, I deepened my understanding of technologies such as HTML, CSS, JavaScript, jqGrid, BootMetro, and jQuery Mobile.

Working at r_keeper solidified my belief that web technologies are the future, as many companies are transitioning their products to the web. I left the company in 2014 due to a relocation to St. Petersburg, where I continued my career as a full stack developer at DataMATRIX.

Kodofon

About Kodofon

Kodofon performs research and development (R&D) in broadband wireless access technologies, designs infocommunication systems, and develops algorithms and software to create innovative products based on the CDMA, GNSS, WiMAX, Wi-Fi, and LTE standards.

My Journey to Kodofon

I began my career as a software engineer at Kodofon in June 2007, which was a very exciting period in my life. I was a young student with big aspirations.

During my time at university, I was determined to get a job at Kodofon. Our professor, Dr. Andrey Savinkov, who taught the Operating Systems course, worked at Kodofon as a head of department. He was the most intelligent and erudite person I have ever met, and every student dreamed of working in his department. I knew I had to stand out from the crowd, so I set out to write the best course work on the most challenging topic I could find.

I spent three months writing my project, taking breaks only for food, sleep, and university. Two books became my new bibles: Windows Internals by Solomon and Russinovich, and The C Programming Language by Kernighan and Ritchie. The title of my project was “Protection Against Unauthorized Access Based on Covert Keyboard Monitoring.” It consisted of keyboard and mouse drivers and a desktop application. In training mode, the system analyzed keyboard handwriting, while in protection mode, the system could block the mouse and keyboard at the hardware level if it determined that the keyboard handwriting did not match.

After passing my course work with excellent marks, the professor noticed me and invited me for a trial period in Kodofon. It was an incredible opportunity for me to work with smart and passionate enthusiasts. The atmosphere was electric, and I was ready to pay money to come to work. In our department, about 60 people worked, and 70% of them had a Ph.D. We had done projects for Huawei and Samsung, and I am confident that the algorithms written by our mathematicians are working in the GPS chips of millions of Samsung phones.

Vehicle tracking system

One of the projects I worked on was the Vehicle tracking system in 2008. My colleagues were talented in developing, soldering, designing, and assembling our first tracker. My part was to build the server and the client application. This time, books like “UNIX Network Programming” by Stevens and “The C++ Programming Language” by Stroustrup became my new bibles. I discovered the incredible world of TCP/IP, including protocols, threads, sockets, and routing, among other things. I wrote my first server application for FreeBSD, which could receive coordinates from trackers, store them in a database, and pass them to client applications.

Server management application

The next step was a management application that helped administer the server application. It was written in cross-platform Qt.

Client application

For the client side, a Win32 application was written in pure C++ and lightweight WTL. The program could display the position of the trackers on the map in real time, download the history for the selected period, analyze the data, and much more.

Mobile application

Long before the advent of Google Play and the App Store, I developed a J2ME mobile phone client application that worked fine even on my Nokia E51.

Web interface

In addition to developing the server and client applications for the Vehicle tracking system project, I also had the opportunity to work on the web application. This was my favorite part of the project, as it allowed clients to easily monitor the trajectory of movements on Google Maps. It was a thrilling experience to work with web development, and it quickly became my favorite aspect of the IT field.

For the backend, I utilized ASP.NET 3.5 and connected it to MSSQL, while the frontend was developed with JavaScript and jQuery. During this time, I gained a deep understanding of web development and realized that the web is the future of technology. I knew that I wanted to be a part of that future, and I became determined to further develop my skills in web development.

Subsequently, based on my work and the work of my colleagues, a separate company was created: a monitoring-services operator named “Tracking LLC”. A huge number of contracts were concluded with public transport companies and freight operators. An agricultural module was developed, and the company equipped combines and tractors with trackers, connected them to sensors and control units, and much more.

I’m proud to have helped build a successful business that brings good profits. My time at Kodofon taught me a lot about building complex systems and bringing them to life. It’s an amazing feeling to see a program you developed begin to live and benefit people. And the secret to achieving this is putting your heart into the code.

However, the most important part of the experience was the people I worked with. My colleagues were always supportive, helpful, and taught me so much. They were like a second family to me, and I’m grateful for the time we spent together during those 5 years.