Posts

Showing posts from 2017

[Caffe] How to use Caffe to solve the regression problem?

A question came to my mind recently: how do we use Caffe to solve a regression problem? We usually see plenty of examples of image recognition with labels, and those are classification problems. In my experience, I have solved regression problems using TensorFlow, not Caffe, but in theory they work the same way. The key point is to use EuclideanLossLayer as the final loss layer; here are the details from the official web site:

http://caffe.berkeleyvision.org/doxygen/classcaffe_1_1EuclideanLossLayer.html#details
"This can be used for least-squares regression tasks. An InnerProductLayer input to a EuclideanLossLayer exactly formulates a linear least squares regression problem. With non-zero weight decay the problem becomes one of ridge regression – see src/caffe/test/test_sgd_solver.cpp for a concrete example wherein we check that the gradients computed for a Net with exactly this structure match hand-computed gradient formulas for ridge regression. (Note: Caffe, a…

[Raspberry Pi] Use Wireless and Ethernet together

The following is my Raspberry Pi 3's configuration in /etc/network/interfaces. In my case, I use the wireless and Ethernet interfaces at the same time.
# Include files from /etc/network/interfaces.d:
source-directory /etc/network/interfaces.d

auto lo
iface lo inet loopback

auto wlan0
allow-hotplug wlan0
iface wlan0 inet manual
    wpa-conf /etc/wpa_supplicant/wpa_supplicant.conf

allow-hotplug eth0
iface eth0 inet static
    address 140.96.29.224
    netmask 255.255.255.0
    up ip route add 100.85.0.0/24 via 140.96.29.254 dev eth0
    up ip route add 140.96.29.0/24 via 140.96.29.254 dev eth0
    up ip route add 140.96.98.0/24 via 140.96.29.254 dev eth0
However, this setup runs into a routing problem: there are two default gateways, and I only want the second one.
pi@raspberrypi:~ $ route
Kernel IP routing table
Destination     Gateway         Genmask         Flags Metric Ref    Use Iface
default         140.96.29.254   0.0.0.0         UG    202    0        0 eth0…

[Debug] Debugging Python and C++ exposed by boost together

While studying Caffe, I was curious about how Caffe provides its Python interface and what kind of tool is used for the wrapping. The answer is Boost.Python. I think it is worth a C++ developer's time to learn, and I will study it soon. In this post, I want to introduce a debugging technique I found in the post below; I believe it is very useful, for example when debugging Caffe with a Python layer. Here is the link:
https://stackoverflow.com/questions/38898459/debugging-python-and-c-exposed-by-boost-together

It points to a useful approach for debugging Python and C++ together, described in the Boost.Python FAQ: http://www.boost.org/doc/libs/1_61_0/libs/python/doc/html/faq/how_do_i_debug_my_python_extensi.html

I tried this approach for a bit of debugging and it works well. Here is a simple example using pycaffe.

$ gdb python
(gdb) target exec python
(gdb) run
>>> import caffe
>>> [C-c]  # Ctrl + C
(gdb) break caffe::set_mode_cpu()  # Please refer to $(CAFFE_ROO…
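Once the breakpoint is in place, continuing in gdb and then calling the wrapped function from the Python prompt should stop execution inside the C++ code. A minimal sketch of the Python side, using pycaffe's set_mode_cpu, which maps to the C++ symbol above:

import caffe

# This call goes through the Boost.Python wrapper into caffe::set_mode_cpu(),
# so gdb stops at the breakpoint set above.
caffe.set_mode_cpu()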

[PCIe] lspci command and the PCIe devices in my server

The following shows my PCIe devices/drivers and the results of the lspci command.
$ cd /sys/bus/pci_express/drivers
$ ls -al
drwxr-xr-x 2 root root 0  7月  6 15:33 aer/
drwxr-xr-x 2 root root 0  7月  6 15:33 pciehp/
drwxr-xr-x 2 root root 0  7月  6 15:33 pcie_pme/

$ cd pcie_pme
$ ls -al


$ lspci | grep 00:1c


Or:
$ cd /sys/bus/pci_express/devices
$ ls -al


$ cd 0000:00:1c.0:pcie01
$ ls -al
total 0
drwxr-xr-x 3 root root    0  7月  5 08:56 ./
drwxr-xr-x 6 root root    0  7月  5 08:56 ../
lrwxrwxrwx 1 root root    0  7月  6 15:51 driver -> ../../../../bus/pci_express/drivers/pcie_pme/
drwxr-xr-x 2 root root    0  7月  6 15:51 power/
lrwxrwxrwx 1 root root    0  7月  6 15:51 subsystem -> ../../../../bus/pci_express/
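The same information can also be pulled out of sysfs programmatically. A small Python sketch of my own, simply walking the paths shown above:

import os

SYSFS = "/sys/bus/pci_express"

# Port-service drivers, e.g. aer, pciehp, pcie_pme
print(sorted(os.listdir(os.path.join(SYSFS, "drivers"))))

# For each port-service device, resolve which driver it is bound to
for dev in sorted(os.listdir(os.path.join(SYSFS, "devices"))):
    link = os.path.join(SYSFS, "devices", dev, "driver")
    driver = os.path.basename(os.readlink(link)) if os.path.islink(link) else "(none)"
    print(dev, "->", driver)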

[Caffe] Install Caffe and the depended packages

This article is just a quick record of all the steps for installing the packages that Caffe depends on. So be careful: it may not be appropriate to walk through them as-is in your environment. ^_^

# Install ccmake
$ sudo apt-get install cmake-curses-gui
# Create my own installation location
$ mkdir ~/local_install

# ProtoBuffer
$ tar zxvf protobuf-2.5.0.tar.gz
$ cd protobuf-2.5.0
$ ./configure --prefix=/home/liudanny/local_install/
$ make -j2
$ make install
# Boost
$ tar xvf boost_1_56_0.tar.bz2
$ cd boost_1_56_0
### ./bootstrap.sh --show-libraries ###
$ ./bootstrap.sh --with-libraries=program_options,filesystem,system,exception,thread
$ ./b2
$ cp -r boost/ /home/liudanny/local_install/include
$ cp stage/lib/* /home/liudanny/local_install/lib/
# Gflags
$ unzip gflags-2.1.1.zip
$ cd gflags-2.1.1
$ mkdir build
$ cd build
$ cmake ..
$ ccmake ..

$ make -j2
$ make install
# Glog
$ tar zxvf glog-0.3.3.tar.gz
$ cd glog-0.3.3
$ ./configure --prefix=/home/liudanny/local_install
$ make -j2
$…

[NCCL] Build and run the test of NCCL

NCCL requires at least CUDA 7.0 and Kepler or newer GPUs. Best performance is achieved when all GPUs are located on a common PCIe root complex, but multi-socket configurations are also supported.

Note: NCCL may also work with CUDA 6.5, but this is an untested configuration.

Build & run
To build the library and tests:

$ cd nccl
$ make CUDA_HOME=<cuda install path> test
Test binaries are located in the subdirectories nccl/build/test/{single,mpi}.

~/git/nccl$ export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:./build/lib
~/git/nccl$ ./build/test/single/all_reduce_test 100000000
# Using devices
#   Rank  0 uses device  0 [0x04] GeForce GTX 1080 Ti
#   Rank  1 uses device  1 [0x05] GeForce GTX 1080 Ti
#   Rank  2 uses device  2 [0x08] GeForce GTX 1080 Ti
#   Rank  3 uses device  3 [0x09] GeForce GTX 1080 Ti
#   Rank  4 uses device  4 [0x83] GeForce GTX 1080 Ti
#   Rank  5 uses device  5 [0x84] GeForce GTX 1080 Ti
#   Rank  6 uses device  6 [0x87] GeForce GTX 1080 Ti
#   Rank  7 uses de…

[Mpld3] Render Matplotlib chart to web using Mpld3

The following example renders a matplotlib chart on the web, built on the Django framework. I encountered some problems before, such as not being able to see the chart on the web page, or a run-time error after reloading the page. But all the problems are now solved.

<<demo/views.py>>
import matplotlib.pyplot as plt
import numpy as np
import mpld3

def plot_test1(request):
    context = {}
    fig, ax = plt.subplots(subplot_kw=dict(axisbg='#EEEEEE'))
    N = 100

    """
    Demo about using matplotlib and mpld3 to render charts
    """
    scatter = ax.scatter(np.random.normal(size=N),
                         np.random.normal(size=N),
                         c=np.random.random(size=N),
                         s=1000 * np.random.random(size=N),
                         alpha=0.3,
                         cmap=plt.cm.jet)
    ax.grid(color='white', linestyle='solid')

    ax.set_title("Scatter Plot (with tooltips!)", size=20)

    labels = ['point {0}'.format(i + 1) for i in range(N)]
    tooltip = mpld3.plugins.PointLabelTooltip(scatter, labels=label…
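The excerpt above is cut off. For reference, here is a minimal end-to-end sketch of the same idea that I would expect to work, assuming Django's HttpResponse; the Agg backend and plt.close() are my own additions (a common way to avoid reload-time errors, not necessarily the fix used in the original post):

import matplotlib
matplotlib.use('Agg')              # render off-screen, no GUI backend needed
import matplotlib.pyplot as plt
import numpy as np
import mpld3
from django.http import HttpResponse

def plot_test_minimal(request):    # hypothetical view name
    fig, ax = plt.subplots()
    N = 100
    scatter = ax.scatter(np.random.normal(size=N), np.random.normal(size=N),
                         s=1000 * np.random.random(size=N), alpha=0.3)
    labels = ['point {0}'.format(i + 1) for i in range(N)]
    tooltip = mpld3.plugins.PointLabelTooltip(scatter, labels=labels)
    mpld3.plugins.connect(fig, tooltip)

    html = mpld3.fig_to_html(fig)  # embeds the required mpld3/d3 JavaScript
    plt.close(fig)                 # free the figure so reloads do not leak memory
    return HttpResponse(html)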

[Hadoop] To build a Hadoop environment (a single node cluster)

For the purpose of studying Hadoop, I had to build a testing environment. I found the following resource links good enough for building a single-node Hadoop MapReduce cluster. There are also some changes specific to my environment, so I add a few comments for my own reference.

http://www.thebigdata.cn/Hadoop/15184.html
http://www.powerxing.com/install-hadoop/
Log in as the user "hadoop":
$ sudo su - hadoop
Go to the Hadoop installation directory:
$ cd /usr/local/hadoop
Add the variables in ~/.bashrc:
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
export HADOOP_HOME=/usr/local/hadoop
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export HADOOP_INSTALL=/usr/local/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
Modify $JAVA_HOME in etc/hadoop/hadoop-env.sh:
export JAVA_HOME=/usr/lib/jvm/java-…

[Spark] To install Spark environment based on Hadoop

This document records how to install a Spark environment on top of the Hadoop setup from the previous post. To run Spark on an Ubuntu machine, Java must be installed first; the following commands install Java on Ubuntu easily.

$ sudo apt-get install openjdk-7-jre openjdk-7-jdk
$ dpkg -L openjdk-7-jdk | grep '/bin/javac'
$ /usr/lib/jvm/java-7-openjdk-amd64/bin/javac

So, we can setup the JAVA_HOME environment variable as follows:
$ vim /etc/profile
  append this ==> export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64

$ sudo tar -zxf ~/Downloads/spark-1.6.0-bin-without-hadoop.tgz -C /usr/local/
$ cd /usr/local
$ sudo mv ./spark-1.6.0-bin-without-hadoop/ ./spark
$ sudo chown -R hadoop:hadoop ./spark

$ sudo apt-get update
$ sudo apt-get install scala
$ wget http://apache.stu.edu.tw/spark/spark-1.6.0/spark-1.6.0-bin-hadoop2.6.tgz
$ tar xvf spark-1.6.0-bin-hadoop2.6.tgz
$ cd spark-1.6.0-bin-hadoop2.6/bin
$ ./spark-shell

$ cd /usr/local/spark
$ cp ./conf/spark-env.sh.temp…
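As a quick sanity check alongside spark-shell, the Python shell shipped with Spark can be used the same way. A minimal sketch of my own (the master URL and app name are placeholders); run it with bin/spark-submit, since bin/pyspark already creates a SparkContext named sc for you:

from pyspark import SparkContext

sc = SparkContext("local[2]", "sanity-check")
rdd = sc.parallelize(range(100))
print(rdd.map(lambda x: x * x).sum())   # expect 328350
sc.stop()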

[picamera] Solving the problem of video display using Raspberry Pi Camera

When I tried to use the Raspberry Pi Camera to display video or images, I encountered a problem: there was no image frame, and the GUI showed a black frame on the screen. It took me a while to figure out this issue.
After searching for the same error on the Internet, I found it is related to using picamera v1.11 with Python 2.7. Downgrading to picamera v1.10 should resolve the blank/black frame issue:

The Linux commands are as follows:
$ sudo pip uninstall picamera
$ sudo pip install 'picamera[array]'==1.10

So, it seems there are some issues with the most recent version of picamera that are causing a bunch of problems for Python 2.7 and Python 3 users.
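For reference, here is a minimal capture test I would use to verify the downgraded library (the picamera[array] extra installed above provides PiRGBArray; the resolution and warm-up delay are arbitrary choices):

import time
from picamera import PiCamera
from picamera.array import PiRGBArray

camera = PiCamera()
camera.resolution = (640, 480)
raw = PiRGBArray(camera, size=(640, 480))

time.sleep(2)                       # give the sensor time to warm up
camera.capture(raw, format="bgr")   # BGR layout is convenient for OpenCV
print(raw.array.shape)              # expect (480, 640, 3)
camera.close()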

[Kafka] Install and setup Kafka

Kafka is used for building real-time data pipelines and streaming apps. It is horizontally scalable, fault-tolerant, wicked fast, and runs in production in thousands of companies.



Install and setup Kafka
$ sudo useradd kafka -m
$ sudo passwd kafka
$ sudo adduser kafka sudo
$ su - kafka
$ sudo apt-get install zookeeperd


To make sure that it is working, connect to it via Telnet:
$ telnet localhost 2181
$ mkdir -p ~/Downloads
$ wget "http://mirror.cc.columbia.edu/pub/software/apache/kafka/0.8.2.1/kafka_2.11-0.8.2.1.tgz" -O ~/Downloads/kafka.tgz
$ mkdir -p ~/kafka && cd ~/kafka
$ tar -xvzf ~/Downloads/kafka.tgz --strip 1
$ vi ~/kafka/config/server.properties

By default, Kafka doesn't allow you to delete topics. To be able to delete topics, add the following line at the end of the file:
⇒ delete.topic.enable = true

Start Kafka
$ nohup ~/kafka/bin/kafka-server-start.sh ~/kafka/config/server.properties > ~/kafka/kafka.log 2>&1 &

Publish the string "Hel…

[InfluxDB] Install and setup InfluxDB

Download the package and install it:
$ wget https://s3.amazonaws.com/influxdb/influxdb_0.12.1-1_amd64.deb
$ sudo dpkg -i influxdb_0.12.1-1_amd64.deb

Edit the influxdb.conf file:
$ vim /etc/influxdb/influxdb.conf


Restart InfluxDB:
$ sudo service influxdb restart
   influxdb process was stopped [ OK ]
   Starting the process influxdb [ OK ]
   influxdb process was started [ OK ]

$ sudo netstat -naptu | grep LISTEN | grep influxd
tcp6       0      0 :::8083                 :::*                    LISTEN      3558/influxd  
tcp6       0      0 :::8086                 :::*                    LISTEN      3558/influxd  
tcp6       0      0 :::8088                 :::*                    LISTEN      3558/influxd

Client command-line tool:
$ influx
> show databases

[OpenGL] Draw 3D and Texture with BMP image using OpenGL Part I

It has been more than half a year since I posted any article on my blog, and that makes me a little embarrassed. Well, to break this situation, I will quickly explain a simple concept about the OpenGL coordinate system.

Before taking an adventure into OpenGL, we have to know the OpenGL coordinate system first. Please check out the following graph. As we can see, the positive z axis points toward us, which is quite different from OpenCV.



If we take a closer look, the following OpenGL code can be explained by the picture below:


glBegin(GL_QUADS) # Start Drawing The Cube

# Front Face (note that the texture's corners have to match the quad's corners)
glTexCoord2f(1.0, 0.0); glVertex3f(-1.0, -1.0, 1.0) # Bottom Left Of The Texture and Quad
glTexCoord2f(1.0, 1.0); glVertex3f( 1.0, -1.0, 1.0) # Bottom Right Of The Texture and Quad
glTexCoord2f(0.0, 1.0); glVertex3f( 1.0, 1.0, 1.0) # Top Right Of The Texture and Quad
glTexCoord2f(0.0, 0.0); glVertex3f(-1.0, 1.0, 1.0) # Top Left Of Th…
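As a side note, here is a separate minimal sketch of my own (assuming PyOpenGL and an already-created OpenGL window/context, e.g. from pygame or GLUT) that illustrates the coordinate convention above: since +z points toward the viewer, the scene is translated along -z so it sits in front of the camera.

from OpenGL.GL import (glMatrixMode, glLoadIdentity, glTranslatef,
                       GL_PROJECTION, GL_MODELVIEW)
from OpenGL.GLU import gluPerspective

glMatrixMode(GL_PROJECTION)
glLoadIdentity()
gluPerspective(45.0, 640.0 / 480.0, 0.1, 100.0)   # fovy, aspect, near, far
glMatrixMode(GL_MODELVIEW)
glLoadIdentity()
glTranslatef(0.0, 0.0, -5.0)   # push the cube 5 units away from the camera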