Lingkang Zhang

I build robots. I am the Interim CTO at A&K Robotics, where I lead R&D on various types of autonomous robots and self-driving vehicles. Previously, I was the CEO and co-founder of InspiRED Robotics, which created machine vision solutions for drones. I studied Human-Robot Interaction in the Simon Fraser University Autonomy Lab with Richard Vaughan. Check out my YouTube page for my latest personal robot videos, and my LinkedIn for videos of my professional projects.

  • Github
  • Robots
  • Education
  • Publications
  • Research
  • Competitions

    Robots:

    HRC Model 4

    A 6-DOF robotic arm equipped with a depth camera, a force-feedback-enabled gripper, and a teleoperation system designed for precision tasks. Built in 2023.

    HRC Model 0

    A 1.9-foot, 11-DOF bipedal robot equipped with an Inertial Measurement Unit (IMU) that achieves self-balancing and bipedal walking with a simple yet effective PID-based controller. Built in 2022.
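
    As a rough illustration of such a PID balance loop (a minimal sketch, not the robot's actual code; the IMU read and servo write are hypothetical stubs):

    ```python
    # Minimal PID balance-loop sketch. The IMU read and servo write are
    # hypothetical stubs, not the robot's actual API.
    import time

    class PID:
        def __init__(self, kp, ki, kd):
            self.kp, self.ki, self.kd = kp, ki, kd
            self.integral = 0.0
            self.prev_error = 0.0

        def update(self, error, dt):
            self.integral += error * dt
            derivative = (error - self.prev_error) / dt
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * derivative

    def read_imu_pitch():         # stub: would return filtered pitch, radians
        return 0.0

    def set_ankle_offset(angle):  # stub: would command the ankle servos
        pass

    pid = PID(kp=2.0, ki=0.1, kd=0.05)  # gains would be tuned on the robot
    dt = 0.01                           # 100 Hz control loop
    for _ in range(1000):               # `while True` on the real robot
        pitch = read_imu_pitch()
        correction = pid.update(0.0 - pitch, dt)  # drive pitch toward upright
        set_ankle_offset(correction)
        time.sleep(dt)
    ```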

    DIY 3D Lidar

    A DIY 3D lidar made from a 2D lidar, a servo motor, and a few 3D-printed parts, with a total cost of around $100. Its performance was verified by performing 3D mapping with the LOAM software. Built in 2021.
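
    The core math behind this kind of tilting-lidar build is rotating each 2D scan into 3D by the servo's current angle; a minimal sketch, assuming the servo tilts the scan plane about the lidar's x-axis and ignoring mounting offsets and calibration:

    ```python
    # Sketch: lift one 2D scan into 3D using the tilt servo's angle.
    # Assumes the lidar scans in its own x-y plane and the servo tilts
    # that plane about the x-axis; mounting offsets/calibration omitted.
    import math

    def scan_to_points(ranges, angle_min, angle_increment, tilt):
        """ranges: one 2D lidar scan; tilt: servo angle in radians."""
        points = []
        for i, r in enumerate(ranges):
            if not math.isfinite(r):
                continue                              # skip invalid returns
            theta = angle_min + i * angle_increment   # beam angle in scan plane
            x = r * math.cos(theta)
            y = r * math.sin(theta)
            # rotate the scan plane about the x-axis by the tilt angle
            points.append((x, y * math.cos(tilt), y * math.sin(tilt)))
        return points

    # e.g. a 3-beam scan taken with the servo tilted 10 degrees
    print(scan_to_points([1.0, 1.2, 0.9], -0.1, 0.1, math.radians(10)))
    ```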

    Quadruped Robot ChiTu II

    A ROS-enabled quadruped robot actuated by 12 9g-level metal-gear servo motors. It is an upgraded version of Quadruped ChiTu, with servo motors that double the torque and adaptive feet 3D-printed in carbon-fiber-filled nylon. It is equipped with an IMU and controlled by a Raspberry Pi 4B. Built in 2021.

    Quadruped Robot ChiTu

    A ROS-enabled quadruped robot actuated by 12 9g-level servo motors, essentially an upgraded version of Quadruped Yuki Mini. It is equipped with an IMU and controlled by a Raspberry Pi 4B. Besides a dynamic trotting gait, it can also crawl on uneven terrain thanks to the pose sensing of the IMU. Built in 2021.
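
    One common way to implement this kind of IMU-assisted crawling is to level the body by adjusting each foot's target height from the measured roll and pitch; a simplified sketch with illustrative geometry, gains, and names (not the robot's actual code):

    ```python
    # Simplified body-levelling sketch: per-leg foot heights computed from
    # IMU roll/pitch. Geometry, gains and names are illustrative only.
    import math

    FOOT_XY = {  # foot (x, y) offsets from the body center, meters
        "front_left":  (+0.06, +0.04), "front_right": (+0.06, -0.04),
        "rear_left":   (-0.06, +0.04), "rear_right":  (-0.06, -0.04),
    }

    def level_body(roll, pitch, stand_height=0.08, gain=0.8):
        """Return foot heights that counteract the measured body tilt."""
        targets = {}
        for leg, (x, y) in FOOT_XY.items():
            # feet on the 'downhill' side extend, 'uphill' feet retract
            dz = gain * (x * math.tan(pitch) + y * math.tan(roll))
            targets[leg] = stand_height + dz
        return targets

    # in the control loop this would feed per-leg inverse kinematics:
    print(level_body(roll=math.radians(5), pitch=math.radians(-3)))
    ```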

    Quadruped Robot Yuki Mini

    A ROS-enabled quadruped robot actuated by 12 9g-level servo motors. It is equipped with an IMU and controlled by a Raspberry Pi 4B. Besides a dynamic trotting gait, it can also crawl on uneven terrain thanks to the pose sensing of the IMU. Built in 2021.

    Quadruped Robot Yuki

    A ROS-enabled quadruped robot built with very powerful 46 kg·cm servo motors. It is equipped with an IMU and controlled by an Nvidia Jetson Nano. Built in 2020.

    Quadruped Robot Tsuki Mini

    A ROS-enabled, super-mini, low-cost (<$100) quadruped robot. Built in 2020.

    Quadruped Robot Tsuki

    A ROS-enabled, highly dynamic quadruped robot. A brand-new design that is stronger, faster, and more robust than Quadruped Kaze. Built in 2019.

    Quadruped Robot Kaze

    A ROS-enabled heavy-load quadruped robot built with strong RDS 3135 servo motors. A simple Python library is introduced for controlling multiple servo motors with no position feedback. Built in 2019.
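
    With no position feedback, servo speed is typically controlled by ramping the commanded angle over time; a minimal sketch in that spirit (the library's actual API is not reproduced here):

    ```python
    # Open-loop speed control sketch: with no position feedback, speed is
    # set by stepping the *commanded* angle over time. write_servo() is a
    # stub standing in for the actual PWM write.
    import time

    def write_servo(servo_id, angle):   # stub: would emit the PWM pulse
        pass

    def move_servo(servo_id, start, goal, speed_dps, rate_hz=50):
        """Ramp the commanded angle from start to goal at speed_dps deg/s."""
        step = speed_dps / rate_hz
        angle = start
        while abs(goal - angle) > step:
            angle += step if goal > angle else -step
            write_servo(servo_id, angle)
            time.sleep(1.0 / rate_hz)
        write_servo(servo_id, goal)     # land exactly on the target

    move_servo(0, start=0.0, goal=90.0, speed_dps=60)  # ~1.5 s sweep
    ```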

    Quadruped Robot 9g

    A ROS-enabled quadruped robot built with low-cost (~$5) 9g servo motors. The challenge was to control the speed and position of 12 PWM servo motors with no position feedback, and to run ROS on an extremely small SBC. Built in 2018.

    (Not) Low-cost ROS Navigation Platform

    A ROS-navigation-enabled (not) low-cost mobile robot equipped with a depth camera.

    Low-cost ROS Navigation Platform

    A ROS-navigation-enabled low-cost mobile robot equipped with a laser scanner.

    inspiRED Humanoid Robot V3.0

    The third generation of the inspiRED humanoid robot, a 50 cm humanoid. Its controller is a low-cost but powerful single-board Linux computer, the Odroid-XU4, and we use ROS and OpenCV with it. It walks, barely.

    inspiRED Rover V1.0

    A mobile robot controlled by a Raspberry Pi, equipped with a camera, stereo speakers, a bumper, and four IR distance sensors.

    inspiRED Humanoid Robot V2.0

    The second generation of the inspiRED humanoid robot, a child-size (80 cm) humanoid. Its controller is a low-cost but powerful single-board Linux computer, the Odroid-U3, and we use ROS and OpenCV with it.

    inspiRED Humanoid Robot V1.0

    InspiRED is a LEGO-like, 3D-printable humanoid robot platform featuring highly customizable components. Its controller is a low-cost but powerful single-board Linux computer, the Odroid-U3, and we use ROS and OpenCV with it. A large number of commonly used electronic devices are supported, including RC servos, a camera, an HDMI display, an IR sensor, an IMU, a speaker, a microphone, LEDs, and a PS2 controller. We participated in the robotics startup competition, “Ogopogo’s Lair”, and won funding support from NCFRN.

    Initial Humanoid Robot

    Initial is a 40 cm humanoid robot. It was used to test running ROS on a single-board computer, an Odroid-U3, which talks to a low-level controller board over a serial port to drive the servo motors. It is also equipped with an HD camera, and its video stream can be sent to a laptop over Wi-Fi.
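
    Talking to such a low-level board typically amounts to writing small framed commands over the serial port; a generic pyserial sketch, with an invented frame format since the board's real protocol isn't documented here:

    ```python
    # Generic sketch of commanding servos on a low-level board over serial.
    # The frame format (0xFF header, id, angle, checksum) is invented for
    # illustration; the real board's protocol may differ entirely.
    import serial  # pyserial

    port = serial.Serial("/dev/ttyUSB0", baudrate=115200, timeout=0.1)

    def send_servo_command(servo_id, angle_deg):
        angle = max(0, min(180, int(angle_deg)))  # clamp to servo range
        checksum = (servo_id + angle) & 0xFF
        port.write(bytes([0xFF, servo_id, angle, checksum]))

    send_servo_command(3, 90)  # center servo 3
    ```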

    Intelligent Navigation and Tracking Robot

    Built for the Challenge Cup University Students Science and Technology Competition. I was the project leader and covered most of the programming and algorithm design work. We designed and built the Intelligent Navigation and Tracking Robot in one year, won first prize in Beijing, and were the only team from my university to qualify for the 2013 national final, where we won second prize.

    Quadrotor Based on STM32 MCU

    I developed a quadcopter based on an STM32 MCU starting in November 2013. A WLAN module is attached to its CPU so that it can be controlled by a mobile phone over WLAN.

    Rover Based on ARM7

    I developed a rover based on an ARM7 MCU in February 2013. It is controlled by an LPC2148, and its position can be tracked using Zigbee modules connected to a laptop that displays it.


    Education:

    • Master of Science, School of Computing Science, Simon Fraser University
    • Graduate Certificate in Science and Technology Commercialization, Beedie School of Business, Simon Fraser University
    • Bachelor of Engineering, School of Electronic and Information Engineering, Beijing Jiaotong University


    Publications:

    • Optimal Robot Selection by Gaze Direction in Multi-Human Multi-Robot Interaction
    • L. Zhang, R. Vaughan
      2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2016), Korea, October 2016. (PDF)

    • Optimal Gaze-Based Robot Selection in Multi-Human Multi-Robot Interaction
    • L. Zhang, R. Vaughan
      Human-Robot Interaction Pioneers Workshop at the 2016 ACM/IEEE International Conference on Human-Robot Interaction (HRI 2016 Workshop), Christchurch, New Zealand, March 2016. (PDF)

    • Orbiting a Moving Target with Multi-Robot Collaborative Visual SLAM
    • J. Perron*, R. Huang*, J. Thomas, L. Zhang, P. Tan, R. Vaughan
      Workshop on Multi-View Geometry in Robotics (MVIGRO) at the 2015 Robotics: Science and System Conference (RSS 2015 Workshop), Rome, Italy, July 2015 (PDF)

    • A Gaze-based Attention System for Multi-human Multi-robot Interaction
    • L. Zhang
      Master's thesis. (PDF)


    Research:

    Optimal Gaze-based Robot Selection in Multi-Human Multi-Robot Interaction

    This project presents a computer-vision-based system for interaction between multiple humans and multiple robots. Each human can “select” (obtain the undivided attention of) a robot by simply looking directly at it. This extends previous work in which a single human could select one or more robots from a population. Each robot optimally assigns human identities to tracked faces in its camera view using a local Hungarian algorithm; the system then finds the globally optimal allocation of robot-to-human selections using a second, centralized Hungarian algorithm. This is the first demonstration of optimal many-to-many robot-selection HRI.
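
    The assignment step can be illustrated with an off-the-shelf Hungarian solver; a toy sketch of the centralized allocation using SciPy's linear_sum_assignment, with a made-up gaze-score matrix:

    ```python
    # Toy sketch of the centralized assignment step: rows are robots,
    # columns are humans, entries score how directly each human's gaze
    # selects each robot (numbers are made up for illustration).
    import numpy as np
    from scipy.optimize import linear_sum_assignment

    gaze_scores = np.array([
        [0.9, 0.1, 0.2],
        [0.2, 0.8, 0.3],
        [0.1, 0.4, 0.7],
    ])

    # The Hungarian algorithm minimizes cost, so negate to maximize score.
    robot_idx, human_idx = linear_sum_assignment(-gaze_scores)
    for r, h in zip(robot_idx, human_idx):
        print(f"robot {r} -> human {h} (score {gaze_scores[r, h]:.1f})")
    ```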

    Cooperative Behaviour for Model Training with Autonomous Mobile Robots

    In order to gain the benefits of supervised learning techniques without requiring a human to construct a labeled dataset, we have developed a behavior for a pair of mobile robots to train a visual classifier on their own. This autonomous approach takes advantage of the rote nature of dataset creation to allow machines to replace humans with little appreciable loss of performance. A specific implementation of this behavior was written for Chatterbox robots, whose automatically-built dataset was compared to human-constructed ones.

    Orbiting a Moving Target with Multi-Robot Collaborative Visual SLAM

    Towards autonomous 3D modelling of moving targets, we present a system in which multiple ground-based robots cooperate to localize, follow, and scan a moving target from all sides. Each robot has a single camera as its only sensor, and they perform collaborative visual SLAM (CoSLAM). We present a simple robot controller that maintains the visual constraints of CoSLAM while orbiting a moving target so as to observe it from all sides. Real-world experiments demonstrate that multiple ground robots can successfully track and scan a moving target. (video)
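
    A controller of this flavor can be approximated by regulating the standoff distance to the target while commanding a tangential velocity; a simplified 2D sketch with illustrative gains (not the paper's actual controller):

    ```python
    # Simplified 2D orbit controller: hold a standoff distance from the
    # target while moving tangentially around it. Gains and names are
    # illustrative, not the paper's actual controller.
    import math

    def orbit_cmd(robot_xy, target_xy, standoff=1.5, v_orbit=0.3, k_r=0.8):
        dx = target_xy[0] - robot_xy[0]
        dy = target_xy[1] - robot_xy[1]
        dist = math.hypot(dx, dy)
        if dist < 1e-6:
            return 0.0, 0.0                  # on top of the target: stop
        ux, uy = dx / dist, dy / dist        # unit vector toward target
        tx, ty = -uy, ux                     # perpendicular: orbit direction
        v_r = k_r * (dist - standoff)        # radial term corrects range error
        return v_r * ux + v_orbit * tx, v_r * uy + v_orbit * ty

    print(orbit_cmd((0.0, 0.0), (1.0, 0.0)))  # world-frame velocity command
    ```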


    Competitions:

    ACM/ICPC 2012

    I studied basic algorithms and took part in programming contests on Codeforces and other online platforms. I officially became a member of the BJTU ACM/ICPC team in 2012. Together with my two teammates, I competed in the Asian regional qualifier of the 37th ACM/ICPC in 2012, and we won a bronze medal. (Solved problems are collected in the Algorithm category of this blog.)

    TECO Green Tech International Contest

    The TECO Green Technology International Contest is a contest on environmentally friendly technology organized by the TECO Technology Foundation in Taiwan. In 2013, our team's project, Indoor Localization and Navigation Based on Zigbee, was enrolled in the International Final. (more about this)