T_System

The (non-)moving object tracking system via two-axis camera motion and an n-joint robotic arm, for Raspberry Pi distributions.

License: MIT

Supported Environments

  • Operating systems: Linux
  • Python versions: Python 3.x (64-bit)
  • Distros: Raspbian
  • Package managers: APT, pip
  • Languages: English

Requirements

Hardware
  • Raspberry Pi 2/3 Model B, B+ or higher
  • Raspberry Pi Camera
  • n servo motors
  • 2 axes for the pan-tilt motion used by t_system's target-locking ability (see the sketch after this list)
  • n-2 axes for t_system's robotic arm feature (optional)
Software
  • All required libraries are installed automatically by the installation scripts. To see these libraries, look here
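A hedged sketch of how the pan-tilt axes appear on the command line, based only on the flags listed in the usage synopsis below; the GPIO pin numbers are placeholders and the mapping of flags to hardware is an assumption, not documented behaviour:

# placeholder pins for the two pan-tilt servos; --ls-gpios PAN TILT is taken from the usage synopsis below
t_system --ls-gpios 17 27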

Installation

Clone the GitHub repository and run

sudo ./install.sh

in the repository directory.

For development mode: sudo ./install-dev.sh

If the installation fails, try sudo -H ./install-dev.sh
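Putting this together, a minimal sketch of the whole installation flow, assuming the repository path sevgiun/T_System shown above (adjust the URL if the repository has moved):

# assumed repository URL; run the installer from the repository root
git clone https://github.com/sevgiun/T_System.git
cd T_System
sudo ./install.sh        # or sudo ./install-dev.sh for development mode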

Usage

usage: t_system [-h] [--interface {official_stand,augmented,remote_ui,None}]
                [--stand-gpios RED-LED GREEN-LED FAN] [--host HOST]
                [--port PORT] [--debug] [-l] [-s]
                [--detection-model DETECTION_MODEL] [--cascades CASCADES] [-j]
                [--encoding-file ENCODING_FILE] [--use-tracking-api]
                [--tracker-type {BOOSTING,MIL,KCF,TLD,MEDIANFLOW,GOTURN,MOSSE,CSRT}]
                [--camera-rotation CAMERA_ROTATION]
                [--resolution WIDTH HEIGHT] [--framerate FRAMERATE]
                [--chunk CHUNK] [--rate RATE] [--channels CHANNELS]
                [--audio_device_index AUDIO_DEVICE_INDEX]
                [--shoot-formats VIDEO AUDIO MERGED] [--shot-format SHOT] [-x]
                [--sd-channels SD_CHANNELS] [--arm-name ARM]
                [--ls-gpios PAN TILT] [--ls-channels PAN TILT]
                [--AI AI | --non-moving-target | --arm-expansion] [-p]
                [--ap-wlan AP_WLAN] [--ap-inet AP_INET] [--ap-ip AP_IP]
                [--ap-netmask AP_NETMASK] [--ssid SSID] [--password PASSWORD]
                [--wlan WLAN] [--inet INET] [--static-ip STATIC_IP]
                [--netmask NETMASK] [--country-code COUNTRY_CODE]
                [--environment {production,development,testing}]
                [--no-emotion] [-S]
                [-m {single_rect,rotating_arcs,partial_rect,animation_1,None}]
                [-r] [-v] [--version]
                {id,remote-ui-authentication,encode-face,self-update,arm,live-stream,r-sync,log}
                ...

positional arguments:
  {id,remote-ui-authentication,encode-face,self-update,arm,live-stream,r-sync,log}
                        officiate the sub-jobs
    id                  Make identification jobs of T_System.
    remote-ui-authentication
                        Remote UI administrator authority settings of the
                        secret entry point that is the new network connection
                        panel.
    encode-face         Generate encoded data from the dataset folder to
                        recognize the man T_System is monitoring during
                        operation.
    self-update         Update source code of t_system itself via `git pull`
                        command from the remote git repo.
    arm                 Management jobs of Denavit-Hartenberg transform matrix
                        models of robotic arms of T_System.
    live-stream         Make Online Stream jobs of T_System.
    r-sync              Make remote synchronization jobs of T_System.
    log                 Make logging jobs of T_System.

optional arguments:
  -h, --help            show this help message and exit

For detailed output, look at Help

The standard running command is t_system --interface {official_stand,augmented,remote_ui,None}. The official_stand, augmented and remote_ui interfaces are described here, here and here, respectively.

Detailed usage is available in USAGE.md
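As an illustrative sketch only (the flag values below are placeholders, not defaults; every flag is taken from the synopsis above), an invocation that starts the official stand interface with an explicit tracker and capture settings could look like:

# placeholder values; see USAGE.md for the authoritative flag documentation
t_system --interface official_stand --tracker-type CSRT --camera-rotation 180 --resolution 640 480 --framerate 30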

Interfaces

Official Stand

Portable usage interface v0.6

Special thanks to Uğur Özdemir for the awesome design idea of this Stand.


  • Dependencies

    • Raspberry Pi 4 Model B/B+.
    • 2 MG995 and 3 SG90 or MG90S servo motors.
  • Properties

    Its body and arm are 1.125 times longer than in the previous version.

    • Tiny Camera

        An 8x8 mm, 8 MP micro camera.
      
    • IR led

        An IR LED that activates automatically for advanced night vision.
      
    • 1 on/off switch key

        Cuts the electric current directly.
      
    • 4 x 18650 Li-ion batteries

        Two series-connected cells power the Raspberry Pi; the other two series-connected cells power the servo motors.
      
    • External Motor Driver

        A 12-bit, 16-channel PWM servo driver with I2C communication.
      
    • Internal Cooler

        Two 30x30x10 mm micro fans and an aluminium block to lower the CPU temperature.
      
    • Local Network Management

        Scans nearby networks. If there is no network connection, it becomes an access point and serves the Remote UI internally (see the example after this list).
      
    • Remote UI access

        No control by tapping on the device itself; it is accessed through the Remote UI from mobile or desktop.
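As a hedged sketch of the access-point related flags from the usage synopsis above (the wlan device, IP address, netmask, SSID and password below are placeholder values; their exact semantics are documented in USAGE.md, not inferred here):

# placeholder values for the access-point related flags listed in the usage synopsis
t_system --interface official_stand --ap-wlan wlan0 --ap-ip 192.168.45.1 --ap-netmask 255.255.255.0 --ssid T_System_AP --password change-me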
      

To see the explanation of the old version, go here

Remote UI

The remote control interface v1.2.7

  • Properties

    • Motion Control

        Two kinds of control for the arm:

            1: axis-based control. Each axis is moved separately.
            2: direction-based control. Movement follows the given direction (up-down / forward-backward / right-left).
      
    • Scenario Control

        Create scenarios by specifying arm positions and generating motion paths from them, so the arm can behave like a camera dolly.
      
    • Previewing / Monitoring

        Watch the live video stream while creating scenarios, and monitor what is being recorded while the system is working.
      
    • Network Control

        Add, update or delete Wi-Fi connection info.
      
    • Recognize People

        Add, update and delete photos of people so they can be recognized. Choose one, several or all of them to be recognized during the job.
      
    • Record Control

        Preview or download the video records with the date-based sorting system.
      
    • Live Streaming

        Start live video streaming on 8 popular websites, including Facebook, YouTube, Periscope and Twitch.
      

Powered by Flask as an embedded framework. Available on mobile and desktop.
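A minimal sketch of starting this interface from the command line, using the --host and --port flags from the usage synopsis above (the address and port values are placeholders, not documented defaults):

# placeholder host/port; serves the Remote UI over the local network
t_system --interface remote_ui --host 0.0.0.0 --port 5000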

Augmented

Augmented usage is explained here, in AUGMENTED.md.


Supported Distributions: Raspbian is fully supported. Other Debian-based ARM architecture distributions are partially supported.

Contribute

If you want to contribute to T_System, please read this guide.

Please consider supporting us by buying a coffee: Buy Me A Coffee
