Once again, I found it easy to browse the Rust Standard Library from Visual Studio Code. I wanted to do the same with Neovim. The rusty-tags project can do this.
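For reference, this is roughly how I generated the tags; the RUST_SRC_PATH layout below is an assumption and may differ between Rust versions:

rustup component add rust-src      # pull the standard library sources
cargo install rusty-tags
export RUST_SRC_PATH="$(rustc --print sysroot)/lib/rustlib/src/rust/library/"
rusty-tags vi                      # run inside the project; writes rusty-tags.vi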
After correctly generating rusty-tags.vi for the Rust standard library, you can follow code from Neovim:
v3.push — Go to definition with Ctrl-}: fn push() implementation…
However, Visual Studio Code shows me the correct fn push() implementation:
A concept I found interesting in Python, coming from the C++ world, is the parameter-passing convention and the particular behavior of a mutable default argument such as a list.
In C++, arguments can be passed by value or by reference. This is explicit, and such arguments are pushed onto the stack. In Python, on the other hand, parameters are passed by object reference (call-by-object): the function receives a reference to the caller's object, but if you rebind the parameter to a new object inside the function, the caller is unaffected, much as with pass-by-value. Kind of weird, isn't it?
For mutable objects, such as a list, changes made through the parameter (for example to individual elements) are visible at the caller's level.
Be careful with a default list argument: the list object is created once, when the function is defined, and is then reused on every call!
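A minimal sketch of both behaviors (the function and variable names are just illustrative):

def fill(values, extra=[]):        # 'extra' default is created once, at definition time
    values.append(1)               # mutates the caller's list (call-by-object)
    values = [99]                  # rebinds the local name only; the caller is unaffected
    extra.append(len(extra))       # mutates the shared default list
    return extra

data = [0]
print(fill(data))   # [0]
print(fill(data))   # [0, 1]     -- the default list remembers previous calls
print(data)         # [0, 1, 1]  -- element changes were visible, the rebinding was not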
In C++ there is something odd when default arguments are combined with virtual functions. Default arguments are not part of the function signature and are not dispatched virtually; they are resolved at compile time from the static type used in the call.
So if you call a virtual function through a base-class pointer, the compiler substitutes the base class's default value, even though the derived class's override is the function that actually runs.
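A minimal sketch of the gotcha (the class names and default values are just illustrative):

#include <iostream>

struct Base {
    virtual ~Base() = default;
    virtual void greet(const char* name = "Base") {
        std::cout << "Base::greet(" << name << ")\n";
    }
};

struct Derived : Base {
    void greet(const char* name = "Derived") override {
        std::cout << "Derived::greet(" << name << ")\n";
    }
};

int main() {
    Derived d;
    Base* p = &d;
    p->greet();   // prints "Derived::greet(Base)": the override runs (dynamic dispatch),
                  // but the default argument comes from the static type, Base
}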
I watched this interesting video from C++ creator Bjarne Stroustrup:
Learning and teaching modern C++.
In this video, Bjarne mentions how hard it is to start interesting projects in C++, such as GUIs and database development, mainly because of the lack of easy-to-use tools. Specifically, at 37:50 he stresses that for his undergrad students trying to do GUI development, “the greatest problem was to get the GUI library installed”. “It is painful”.
C++ Real World Apps
Historically, C and C++ development has been difficult; add to that the mix of compilers, operating systems, libraries, build systems and tools, all different from one another.
In this post I did a “quick” exercise to test those statements. My first try was to set up an environment to develop C++ with GTK+; spoiler alert: it was not quick and it was not easy, even for me with experience in the C/C++ world. My second try was with the MongoDB library on Fedora Linux. Here I will show the setup I use to work and be more productive with this “kind” of development. The first challenge was to get an editor providing intellisense auto-completion and source code browsing.
In previous posts I showed how to set up Neovim to provide C++ intellisense using a language server (LSP) with the Coc plugin. I did not realize that for third-party libraries you need further setup.
There are two language servers that can provide intellisense for third-party libraries with the Coc plugin: clangd and ccls. I tried ccls and was not able to make it work, so I stuck with clangd. After many tries I got intellisense working in Neovim for the MongoDB libraries.
Nvim coc-clangd. MongoDB intellisense.
I did the same for Visual Studio Code:
Visual Studio Code. MongoDB intellisense
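In case it is useful, the usual way to point clangd at third-party headers is a compilation database or a compile_flags.txt; a hedged sketch for the MongoDB C++ driver on Fedora (package names and paths may differ on your system):

# Option 1: have CMake emit compile_commands.json and let clangd pick it up
cmake -S . -B build -DCMAKE_EXPORT_COMPILE_COMMANDS=ON
ln -s build/compile_commands.json .

# Option 2: a minimal compile_flags.txt with the mongocxx/bsoncxx include paths
pkg-config --cflags libmongocxx | xargs -n1 > compile_flags.txt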
I have used vi for code editing most of the time, but now we have more options, so I set up both Nvim and Visual Studio Code. I like to use Nvim for quick browsing and code editing; for something more complex I would use Visual Studio Code.
Visual Studio Code vs Nvim advantages:
What I really like about Visual Studio Code is that I can jump to definitions in system libraries and third-party source code. I can do something similar with ctags or rtags in Neovim, but in Visual Studio Code it works out of the box.
Extensions are friendlier in Visual Studio Code than the ones for the Coc plugin in Neovim; setup is harder in Neovim.
Debugging also works out of the box with Visual Studio Code. When using only vim, I am used to gdb.
C++ debugging with gdb. mongocxx::uri
Here is how it goes with the Visual Studio Code integrated debugger:
C++ debugging with gdb, but inside Visual Studio Code!
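Under the hood it is still gdb; Visual Studio Code just drives it through a launch.json, roughly like this sketch (the configuration name and program path are placeholders):

{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Debug mongocxx sample",
      "type": "cppdbg",
      "request": "launch",
      "program": "${workspaceFolder}/build/app",
      "cwd": "${workspaceFolder}",
      "MIMode": "gdb"
    }
  ]
}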
Setting up libraries and an IDE for third-party libraries in C++ is not easy. That is the C++ world! For the libraries setup
I watched the LinkedIn training “DevOps Foundations” with instructor Carlos Nunez. In section 3 he shows the use of RSpec and Capybara. I will extend it a little, as that section assumes a lot on the instructor's side.
Step 1 launches a Docker container with the Nginx web server. That is OK. Then he starts a second container, which runs Selenium.
Both containers are launched with docker-compose. Nginx is built from a Dockerfile; Selenium is not, a stock image is simply pulled.
The instructor launches a third container, which contains RSpec and Capybara. This container is again specified with a Dockerfile.
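A minimal docker-compose.yml sketch of that layout; the service names, image tag and paths are my own illustration, not the course files:

version: "3"
services:
  web:                       # Nginx, built from its own Dockerfile
    build: ./nginx
    ports:
      - "80:80"
  selenium:                  # stock image pulled as-is, no Dockerfile
    image: selenium/standalone-chrome
  tests:                     # RSpec + Capybara, built from another Dockerfile
    build: ./tests
    depends_on:
      - web
      - selenium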
What is omitted is an explanation of Selenium, Capybara and RSpec. So let’s start.
Selenium here provides browser automation: the container runs a real browser instance driven through the WebDriver API, packaged to run inside a container. The image used also has a VNC server, which is useful for watching the browser when it runs in a container.
Capybara is a Ruby acceptance-testing library that simulates how a user interacts with a web application; it can drive several backends, including Selenium.
RSpec is the Ruby testing DSL the tests are written in.
Why we are using this combo is not explained.
Let’s see my containers launched with docker-compose:
I have customized my Linux terminals. The terminals I use are Terminator and Terminology. I use the zsh shell and the Oh My Zsh framework to add some cool functionality.
For instance, a cool prompt with git integration: the prompt tells me that 1 file was modified and 11 files are not tracked. It is a summary of git status.
Cool command line plus git integration.
Nice icons showing folders and file types…
Cool icons associated with files
Tab completion. Press the Tab key to complete a path. You can use the arrow keys to navigate files and folders.
Tab completion and navigation.
Command-line fuzzy finder. Press Ctrl-r in the terminal, then type the command to look for. You will get a list of previously typed commands matching the search.
Press Ctrl-r, then a pattern. You have the option to navigate with the arrow keys.
Plugins. The git plugin, for instance, shows me all the git options when I type git and then Tab Tab.
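For reference, the relevant part of my ~/.zshrc looks roughly like this; the theme and plugin names are just the ones I happen to use:

export ZSH="$HOME/.oh-my-zsh"
ZSH_THEME="agnoster"          # prompt theme with git status integration
plugins=(git fzf)             # git aliases/completion and the Ctrl-r fuzzy finder
source $ZSH/oh-my-zsh.sh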
There is no doubt coding is getting more complex nowadays. I have to code in Python, C and C++, and I have usually used vi for small edits. For production work I have used PyCharm for Python and Visual Studio for C and C++.
Visual Studio Code is a great editor with multiple extensions to support different languages. However, I am used to vim and have always tried to set up intellisense for the languages I use.
I like to use Visual Studio Code; however, I feel more productive with the vi editor. In this post I am setting up NeoVim with Coc to code in my favorite languages. In the past I tried vim with different plugins for code completion.
In the end, I think NeoVim plus Coc is a good way to keep using vim.
Coc requires Node.js and yarn, so the first step is to install those tools:
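On Fedora, something like this should do it (I am assuming the distro packages here; a Node version manager works too):

sudo dnf install nodejs            # Node.js and npm from the Fedora repos
sudo npm install -g yarn           # yarn installed globally through npm
node --version && yarn --version   # quick sanity check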
Launch NeoVim and manage code completion from inside NeoVim. This is what I like most about Coc: you can install all the code-completion extensions with CocInstall:
Neovim. Show Installed Coc Extensions
CocInstall coc-rust-analyzer and others....
CocInstall Output
PYTHON
Notes: set the interpreter to Python 3. Optionally use Jedi or a language server; I will use Jedi:
I had to remove the python3-jedi package from Fedora and install it with pip3. You can use CocConfig to edit the JSON settings file and toggle between Jedi and the language server.
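For reference, the relevant part of my coc-settings.json (opened with :CocConfig) looked roughly like this, assuming the coc-python extension; key names may differ with other extensions:

{
  "python.pythonPath": "/usr/bin/python3",
  "python.jediEnabled": true
}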
I used to set up VirtualBox guests with a network bridge in order to have bidirectional access, from host to guest and from guest to host. In the simpler NAT mode, I was only able to ssh from my host to the guest but not from the guest to the host.
Most of the time I used bridged network mode in VirtualBox; then I switched from VirtualBox to KVM/qemu/libvirtd. Thanks to virt-manager and GNOME Boxes, it was relatively easy to use those tools instead of VirtualBox.
When using virt-manager, you have the option to set up the network mode from the GUI. Unfortunately there is no option in the drop-down to select a network bridge; instead we have to create the bridge from the command line.
Use the NetworkManager client (nmcli) to create the “br0” bridge interface. In my case the physical Ethernet interface (from a ThinkPad dock) is enp0s20f0u2u1i5.
sudo nmcli con add ifname br0 type bridge con-name br0
sudo nmcli con add type bridge-slave ifname enp0s20f0u2u1i5 master br0
Bring down physical Ethernet interface and bring up bridge br0.
sudo nmcli con down "Wired connection 1"
sudo nmcli connection up br0
Set up the XML file to be used by virsh:
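The content of kvm_br0.xml is roughly this, a minimal libvirt network definition that forwards to the existing br0 bridge:

<network>
  <name>br0</name>
  <forward mode="bridge"/>
  <bridge name="br0"/>
</network>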
Use virsh:
sudo virsh net-define ./kvm_br0.xml
sudo virsh net-start br0
sudo virsh net-autostart br0
You will now see bridge br0 in the drop-down in virt-manager.
My host now has IP address 192.168.0.104.
My KVM guest has IP address 192.168.0.105. Thanks to the bridge, I can ssh from the Ubuntu guest to my Fedora host!
You can modify the bridge IP address manually as well. For instance, use:
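Something along these lines; the address below is just an example on my 192.168.0.x network:

sudo nmcli con modify br0 ipv4.addresses 192.168.0.110/24 ipv4.gateway 192.168.0.1 \
     ipv4.dns 192.168.0.1 ipv4.method manual
sudo nmcli con up br0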
This post will show how to install Node.js and npm from their GitHub repositories. Then TypeScript will be installed.
First, clone the Node.js repo:
git clone https://github.com/nodejs/node.git
Then, as per the build instructions, run ./configure and then make -j4.
Prerequisites are python, gcc and make.
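In other words, the build boils down to:

cd node
./configure
make -j4        # adjust -j to the number of available cores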
This will create the node command. What is not clear is how npm fits into this distribution.
node pre-release
npm is the package manager, and it is just a JS file run by node. It is included in the Node.js distribution,
npm client
but there is also an npm GitHub site. What I found is that the npm git repository only builds the documentation.
git clone https://github.com/npm/cli.git
However, I used this command to install npm:
curl -L https://www.npmjs.com/install.sh | sh
Once I built the node binary, I copied it to my ~/bin folder, which is in my PATH.
npm uses two locations to install libraries: global and local. I am new to Node, but my understanding is that local is project-specific, while global means a single shared path.
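You can check both locations from the command line:

npm config get prefix    # global prefix; global packages land under <prefix>/lib/node_modules
npm root -g              # resolved global node_modules directory
npm root                 # local node_modules of the current project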
For instance, from my own build, global is set to:
Global prefix
Under this prefix the main modules should be installed:
npm global modules
Finally, to install TypeScript, use npm. In this sample I am using the npm from the GitHub repo, which points to a different global prefix.
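The install itself is just:

npm install -g typescript
tsc --version            # verify the TypeScript compiler is on the PATH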
npm from node npm_cli ..
This is weird. I think I will use the npm version I installed with curl.
As an additional note, the npm from the Git repo did not work at first. I had to change the npm script to fix a path issue with npm_cli.js. Once I made that change, npm worked: