I resumed my work on the thesis at the end of August by implementing advanced logging in my C++ projects.
Then I wrote a Python script, so I now have the infrastructure to collect data from experiments and store it in my “experiments library”.
I ran several experiments with various parameters of the CACLA algorithm and various neural network architectures (for the Actor and the Critic) to examine the different behaviours.
I also finished the ICubGrasping class, which is used to manipulate iCub’s hands and perform the operations executed while grasping.
Now I have to work out a first basic architecture for a simple object-grasping task that I can implement and test on iCub.
Then, once I get promising results, I can go on to develop a more advanced and flexible architecture.
After I finished all my exams and took a rest for a couple of weeks, I returned to my master’s thesis.
So, what did I do in this period:
- added separate neural networks that act as a forward and an inverse model; both learn from data gathered during CACLA learning (reaching for an object)
- created a C++ class with many helpful commands for iCub (methods for environment construction, body manipulation, etc.)
- created a class specialized in grasping (hand manipulation), with parameterized methods for hand flexion, extension and the like
I plan to connect this grasping class to a learning algorithm (such as CACLA) and experiment with it.
I also read more carefully an article about infant grasp learning and found lots of interesting biological information (perhaps I can later compare my iCub experiment with how human infants learn to grasp).
I found useful pictures of hand-movement terminology.
These terms are essential for understanding the DoF of iCub’s hand and fingers.
Then I drew a basic diagram for object reaching and grasping, inspired by the FARS model.
iCub has 6 touch sensors on each hand (5 on the fingertips and 1 on the palm).
In the iCub Simulator, the output from these sensors is available at the /robotName/touch port and is streamed as a list of 12 values.
Example of output:
[lpam] 0.0 [rpam] 0.0 [lind] 0.0 [lmid] 0.0 [lrng] 0.0 [llit] 0.0 [lthm] 0.0 [rind] 0.0 [rmid] 0.0 [rrng] 0.0 [rlit] 0.0 [rthm] 0.0
The first letter denotes the left or right hand; the suffix specifies the part of the hand:
- pam – palm
- ind – index finger
- mid – middle finger
- rng – ring finger
- lit – little finger
- thm – thumb
We can also choose whether the values represent just touch/no-touch (boolean) data or also carry pressure information (a double from 0.0 to 1.0).
This is configured in the icub_parts_activation.ini file (pressure: on/off).
The right hand can be controlled via the /robotName/right_arm port; the most important DoF are:
04 – wrist pronosupination
05 – wrist pitch
06 – wrist yaw
07 – hand finger adduction/abduction
08 – thumb opposition
09 – thumb proximal flexion
10 – thumb distal flexion
11 – index proximal flexion
12 – index distal flexion
13 – middle proximal flexion
14 – middle distal flexion
15 – ring & little flexion
Sources: iCub Simulator ReadMe, A Cognitive Robotic Model of Grasping (Macura et al.)
At the beginning of February I implemented the neural network and reinforcement learning C++ libraries that I use in my iCub application.
Most of the time I spent implementing the application, which uses the CACLA algorithm to train iCub to reach for objects.
I’m also planning to create some UML package and class diagrams to show clearly how I designed these libraries and applications.
You can take a look at the C++ source code documentation.
I did some testing of my CACLA library on a task where I tried to move a point in 3D space to a target position.
I have also read…
- an article A Cognitive Robotic Model of Grasping, Macura et al, 2010
- a book An Introduction to Reinforcement Learning, Sutton and Barto, 1998
I subscribed to the RobotCub forum/mailing list – https://lists.sourceforge.net/lists/listinfo/robotcub-hackers