
HAL Robotics Framework

Today is a very exciting day for all of us here at HAL Robotics.
It has been two long years in the making, but we are proud to finally be taking the first public step towards the 1.0 release of the HAL Robotics Framework.

Ready for testing and available to download today are the 1.0-beta versions of the Core feature-set and the Grasshopper client application.
 
This means that from right now you will be able to test and break work-in-progress versions of:

  • Accurate real-time simulation 
  • Trajectory diagnosis including detection of out-of-reach and singular positions
  • Motions specified in Cartesian or joint spaces
  • Simulation of blended motions
  • Online preset catalogs of robots, tools and controllers, accessible from within Grasshopper and ready to drop in to your scripts
  • Program translation to ABB RAPID, KUKA KRL or Universal Robots URScript
  • Lightweight installer which allows you to add, remove and update extensions to the framework with ease
  • Cloud-based licenses to share between your own computers or your whole organisation
  • User interface adapting to the level of control you require

TRY NOW!

With that, we shall let you get started. Don't forget, this is public now, so by all means spread the word to anyone you think may also be interested in testing out this new toolkit.

 
Happy programming,
The HAL Robotics Team

Website: http://hal-robotics.com
Location: London
Members: 105
Latest Activity: Sep 20, 2023

GETTING STARTED

The first thing you'll need to do to get going is create an account on our WIP user-space. This will give you access to a trial and serve as the access point for your licenses, whether personal or shared. Once you create your account and verify your email address you'll find the link for the installer, which you are now free to download, and a couple of pages that may come in handy if you need to administer your licenses or organisational memberships. It's now time to install the framework. There is no need to uninstall your previous HAL Robot Programming & Control installation if you have one; the framework will run in parallel to help you migrate your existing projects.

Once you've run the installer and selected the extensions you want to install, we suggest you head over to our Getting Started playlist on YouTube, which will talk you through the basics of the interface, how the new components work and what goes where to create your first executable toolpath. We will be adding more detailed tutorials over the next few weeks so please subscribe (or just check back there every so often) for walkthroughs, guides, tips and tricks directly from the team that created the software.

Discussion Forum

Configuring Controller

Hi - just installed HAL and trying to connect to my IRC5 controller for an ABB 120. The simulation is working fine but it doesn't seem to be speaking to the physical controller. I am assuming the…

Started by James Charlton Nov 25, 2020.

How to sync HAL with Firefly/Arduino?

Hey there. Thank you guys for developing HAL. Things are much easier using it! A quick question: is there any way to sync Arduino/Firefly with HAL? I have a tool attached to my robot and I need to…

Tags: IRC5, Arduino, ABB, Firefly, Robot

Started by Mehdi FarahBakhsh May 15, 2019.

Changing status bit on Kuka robots 2 Replies

Hi Thibault, I am having problems changing the status bit in HAL. I changed the Flip settings, which works fine in the simulation but nothing is changed in the code generation. How can I make the…

Tags: HAL

Started by Erik Parr. Last reply by Erik Parr Oct 7, 2015.

Spline Movement 2 Replies

Hi Thibault, I'm trying out the simple path example. Is there any way I can generate a spline movement command for a KUKA robot? It's much faster than the line movement command, especially when there are many…

Started by Kada Liao. Last reply by Kada Liao Aug 7, 2014.

Comment Wall


Comment by Thibault Schwartz on July 11, 2013 at 8:34

Dear All,

The first Alpha version of HAL 005, cross-compatible with ABB, Universal Robots and KUKA, is now available. If you are interested in participating in the testing, please send me a PM.

Thibault.

Comment by Thibault Schwartz on April 6, 2013 at 10:52

HAL 004.5 Update available

LOG HALv0.04.5
--------------------------------------------------------------------------------------------------------------------------------------
General updates:
• The FindMAC.gh license requesting utility now allows you to generate an automatic email embedding your data.

Robot Library:
• New IRB4400(60) robot preset.
• New IRB6400Rex(200) robot preset.
• New IRB6620(150) robot preset.
• New IRBT6004 track preset.
• New Track Creator component.

Base Pack:
• The Inverse Kinematics Solver now handles inputs for track positioners (linear external axes).
• The Inverse Kinematics Solver now handles flip overrides to access alternative configurations of the elbow, the wrist, or both.
• New control in the right-click menu of the Inverse Kinematics Solver allowing you to change the threshold for large reorientation analysis.
• The RAPID Generator was outputting wrong confdata for axes with a rotation between +270 and +360°; this is solved.
• The RAPID Generator was outputting wrong local rotation values for targets when a Work Object was used under certain conditions; this is solved.
• The RAPID Generator now outputs the declaration of the Work Object when it is different from WObj0.
• New controls in the right-click menu of the RAPID Generator allowing you to set custom precisions for the coordinate and rotation values used in the code, in order to optimize the size of the program.
• New control in the right-click menu of the RAPID Generator allowing you to force the formatting of the output for multi-robot setups.
• New Track Position Solver component.
• New External Axis Manager component.
• New Signals Manager component.
• New Flip Value List component.

Communication Pack:
• The HAL To Controller component now allows you to manage Digital Output (DO) and simulated Digital Input (DI) signals in real time.
• The OSC To HAL component is now compatible with remote digital signal management (see the byte-level sketch after this list).
• The OSC To HAL component now automatically detects the device (iPhone or iPad) you are using.
• The TouchOSC layout has a new Digital I/Os management menu (2*15 channels on the iPad, 2*6 channels on the iPhone).
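
For anyone curious what such a digital signal looks like on the wire, here is a rough standalone C# sketch. The address pattern, host and port are placeholders, not necessarily what the OSC To HAL component expects; it only illustrates the OSC 1.0 packet layout (padded address string, type tag string, big-endian argument) sent over UDP.

    using System;
    using System.Collections.Generic;
    using System.Net.Sockets;
    using System.Text;

    // Minimal OSC sender sketch. The address "/do/1", the host and the port are
    // placeholders for illustration only, not HAL's actual conventions.
    class OscDigitalSignalSketch
    {
        // Pad an ASCII string to a 4-byte boundary with at least one null, per the OSC 1.0 spec.
        static byte[] OscString(string s)
        {
            var bytes = new List<byte>(Encoding.ASCII.GetBytes(s));
            do { bytes.Add(0); } while (bytes.Count % 4 != 0);
            return bytes.ToArray();
        }

        // Big-endian 32-bit integer argument.
        static byte[] OscInt(int value)
        {
            var b = BitConverter.GetBytes(value);
            if (BitConverter.IsLittleEndian) Array.Reverse(b);
            return b;
        }

        static void Main()
        {
            var packet = new List<byte>();
            packet.AddRange(OscString("/do/1"));   // hypothetical address for digital output 1
            packet.AddRange(OscString(",i"));      // one int32 argument
            packet.AddRange(OscInt(1));            // set the output high

            using var udp = new UdpClient();
            udp.Send(packet.ToArray(), packet.Count, "192.168.0.10", 8000); // host/port are placeholders
        }
    }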

Comment by Thibault Schwartz on March 14, 2013 at 14:09

Dear Youngjae,

Just to specify a little: the IK solver only solves the position of the robot on specific targets. Interpolations between targets are processed directly by the controller, not by the software IK. For the moment, the "Toolpath Interpolation" component of HAL (which can be handy for predicting reorientations between targets) is limited to the simulation of linear (MoveL) interpolations. I am currently working on the minimal-rotation interpolation as well (the equivalent of MoveAbsJ), but it's not ready yet.
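
To make the distinction concrete, here is a tiny standalone C# sketch (not HAL code; all the target and joint values are made up): a linear move interpolates the TCP position in Cartesian space between two targets, while a joint move interpolates each axis value directly, so its TCP path is whatever the kinematics happens to trace.

    using System;

    class InterpolationSketch
    {
        // Linear (MoveL-style): interpolate the TCP position in Cartesian space.
        static double[] LerpCartesian(double[] a, double[] b, double t)
        {
            var p = new double[3];
            for (int i = 0; i < 3; i++) p[i] = a[i] + t * (b[i] - a[i]);
            return p;
        }

        // Joint (MoveAbsJ-style): interpolate each axis value directly;
        // the resulting TCP path is whatever the forward kinematics produces.
        static double[] LerpJoints(double[] a, double[] b, double t)
        {
            var j = new double[a.Length];
            for (int i = 0; i < a.Length; i++) j[i] = a[i] + t * (b[i] - a[i]);
            return j;
        }

        static void Main()
        {
            double[] tcpStart = { 800, 0, 900 }, tcpEnd = { 600, 400, 700 };                    // mm, made up
            double[] jStart = { 0, -30, 45, 0, 60, 0 }, jEnd = { 90, -10, 30, 0, 45, 180 };     // deg, made up

            for (double t = 0; t <= 1.0; t += 0.25)
            {
                var p = LerpCartesian(tcpStart, tcpEnd, t);
                var j = LerpJoints(jStart, jEnd, t);
                Console.WriteLine($"t={t:0.00}  TCP=({p[0]:0},{p[1]:0},{p[2]:0})  A1={j[0]:0.0}  A6={j[5]:0.0}");
            }
        }
    }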

Concerning Galapagos: HAL is already fully compatible with multi-robot setups, so if you want to test a bunch of similar setups with Galapagos and find the best one, you can already do it. But the configuration management is another story: you cannot access it (it is not public, since most of the time the correct configuration is the one which minimizes the 4th-axis rotation). I can make it accessible in a way that lets you add "rules" for the solving (so basically, a list of booleans to force the flip of the elbow and of the wrist). In the new version I am working on, the elbow automatically reorients itself depending on the robot base plane, so that if you have two robots working together, one on the ground and one on the ceiling, there are no kinematics-solving problems (there is one in the current version). In fact I have also introduced this for linear external axes, so you can have tracks anywhere in space.
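
As a rough illustration of what choosing between those branches means, here is a standalone C# sketch; the candidate joint sets are invented, not real solver output, and a real solver would produce one candidate per elbow/wrist flip combination. It simply picks the branch that moves the 4th axis the least.

    using System;
    using System.Collections.Generic;
    using System.Linq;

    // Illustration only: choosing between redundant IK branches by minimising axis-4 reorientation.
    class ConfigurationChoiceSketch
    {
        static void Main()
        {
            double[] current = { 10, -20, 35, 15, 50, 0 };  // current joint values in degrees (made up)

            // One made-up candidate solution per (elbow, wrist) flip combination, all reaching the same target.
            var candidates = new Dictionary<string, double[]>
            {
                ["elbow normal,  wrist normal"]  = new[] {  12.0, -25.0,  40.0,   20.0,  45.0,  10.0 },
                ["elbow normal,  wrist flipped"] = new[] {  12.0, -25.0,  40.0, -160.0, -45.0, 190.0 },
                ["elbow flipped, wrist normal"]  = new[] {  12.0,  60.0, -80.0,   25.0,  50.0,  12.0 },
                ["elbow flipped, wrist flipped"] = new[] {  12.0,  60.0, -80.0, -155.0, -50.0, 192.0 },
            };

            // Prefer the branch that moves axis 4 the least from its current value.
            var best = candidates.OrderBy(c => Math.Abs(c.Value[3] - current[3])).First();
            Console.WriteLine($"Chosen branch: {best.Key} (|dA4| = {Math.Abs(best.Value[3] - current[3]):0.0} deg)");
        }
    }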

For interactive systems, I updated the OSC To HAL component and the TouchOSC layout so that you can now do real-time calibration of hotwire cutters with your phone. More little applications like this will come soon.

Comment by yj on March 14, 2013 at 12:37

Dear Thibault, thank you for the in-depth explanation.

It is great to hear that you are also interested in developing customized and interactive systems that complement robotics, which is already a state-of-the-art technology.

As far as I know, RAPID is an older programming language with pros and cons, the pros including interesting features that work great for robotics. For instance, I didn't know about MoveAbsJ until you mentioned it.


I have realized that Galapagos has certain limitations in reaching deeper into other Grasshopper components in general, which it would need to do to be really powerful. I agree that this is not a trivial issue at all, which I also see in the development of Kangaroo or Karamba, and which in a way calls for either an API or a dynamic plug-in approach, as VVVVers would call it. But it seems necessary at some point in order to take full advantage of the extensive plug-ins you have built already. I mean, the software that I would need to generate RAPID code to operate an ABB costs 1200 euros a year, and you are offering something at a fraction of the cost.

As you highlight many times, it is probably more important to quickly find an accessible configuration than to find all the other parallel configurations; especially near certain singular points, that can cause the system to crash when there are almost infinite possibilities. It is also very practical, when someone asks on the spot whether I can move the TCP from here to there, to be able to come up with one path that seems functional right away. We don't have quantum computers that can consider all possibilities at the same time (well, there is one, but it is controversial whether it really is one).

Initially I thought one of the key issues is the possibility for the user to evaluate and choose between different interpolation methods (joint, linear, circular interpolations), or the possibility to mix those interpolations in one routine that divides into multiple sub-routines, which might help deal with each axis of the robot arm having distinct limitations that either cause problems for the robot or stop the robot. But honestly, I don't understand how your plug-in works well enough to really comment on some of the issues you mention.

But I just recently found a workaround for working with Galapagos, suggested by Daniel Piker on Kangaroo, as you can read here: http://www.grasshopper3d.com/group/kangaroo/forum/topics/reset-simu...

He feeds a list of true and false booleans with HoopSnake into Kangaroo, which gives him a branch of results rather than one. So maybe, in a complicated way, I am asking whether it is possible for the Inverse Kinematics Solver to take in a list of interpolation modes rather than one. I need to look again at how the IK solver works... If I get somewhere I will post a screenshot and a definition...
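
Something like this little C# sketch is what I have in mind (the names and the enum are made up; I know this is not how HAL's inputs actually look): a per-target list of interpolation modes instead of one mode for the whole path.

    using System;
    using System.Collections.Generic;

    // Sketch of the idea only: pair each target with its own interpolation mode.
    enum MotionMode { Joint, Linear, Circular }

    class MixedModeToolpathSketch
    {
        static void Main()
        {
            var targets = new List<(string Name, MotionMode Mode)>
            {
                ("approach", MotionMode.Joint),    // fast reorientation, path shape unimportant
                ("cutStart", MotionMode.Linear),   // straight-line TCP motion
                ("cutArc",   MotionMode.Circular), // arc through a via point (via point omitted here)
                ("retract",  MotionMode.Joint),
            };

            foreach (var (name, mode) in targets)
                Console.WriteLine($"{name}: {mode}");
        }
    }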

Lastly, my idea for the application of depth sensing is not exactly what you have in mind... but maybe with some time I can show the idea better. A great lesson learned about robotics, thanks to you.

Comment by Thibault Schwartz on March 10, 2013 at 10:05

Dear Youngjae,

As a matter of fact, I have also been working on such applications since last August, and HAL now allows you to stream positions in almost real time from any device or software connected to Grasshopper (only compatible with IRC5 controllers). I made some tests with Kinect, phones and tablets and it works (so if you have a good position for your Kinect you can already know when a user is too close to the robot and stop the execution or slow it down), but due to controller limitations I am now working on a different way of sending and managing data to the robot to minimise the latency of the system.
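
The safety logic itself is simple enough to sketch; here is a standalone C# example with made-up distances and positions (it is not the code HAL uses, just the idea): below one threshold the speed override ramps down, and below a second threshold execution stops.

    using System;

    // Sketch of the proximity logic described above; thresholds and positions are made up.
    class ProximityOverrideSketch
    {
        static double SpeedOverride(double distanceMm)
        {
            const double stopDistance = 500;   // mm, assumption
            const double slowDistance = 1500;  // mm, assumption
            if (distanceMm < stopDistance) return 0.0;                       // stop execution
            if (distanceMm < slowDistance)                                   // ramp speed 0..100%
                return (distanceMm - stopDistance) / (slowDistance - stopDistance);
            return 1.0;                                                      // full programmed speed
        }

        static void Main()
        {
            double[] robotBase = { 0, 0, 0 };
            double[] user = { 900, 400, 0 };   // e.g. a Kinect-tracked position in mm, made up

            double d = Math.Sqrt(
                Math.Pow(user[0] - robotBase[0], 2) +
                Math.Pow(user[1] - robotBase[1], 2) +
                Math.Pow(user[2] - robotBase[2], 2));

            Console.WriteLine($"distance = {d:0} mm, speed override = {SpeedOverride(d):P0}");
        }
    }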

Galapagos will not allow you to switch between configurations and toolpaths, since configurations are computed by the IK solver and managed by several pieces of information in the code, which can only be overridden or changed depending on the interpolation you use (MoveJ/MoveL/MoveAbsJ etc.). And once again, some configurations are not reachable, depending on the rotation domains of certain joints (the 4th one for example), or because linear interpolations cannot work for targets requiring more than 90° of rotation. HAL computes by default the most "accessible" configurations in order to minimize 4th-axis flips (which are a pain), and the next update will have a fix to count the laps you do with joints that allow more than 360° of rotation, in order to prevent you from reaching the max values (otherwise the robot is locked and the application is stopped); there is a little bug on the 6th axis in the current version. IMHO these questions are much more important to solve for the design of your application than the approximation of the workspace (it is very easy to measure the max radius of rotation, and singularities can always be reached using MoveAbsJ).
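
The lap-counting idea, in a rough standalone C# sketch (the joint domain and the commanded values are invented): unwrap consecutive commanded angles so each step is the shortest rotation, accumulate the laps, and flag targets that would leave the joint's rotation domain.

    using System;
    using System.Collections.Generic;

    // Sketch of "lap counting" on an axis that allows more than 360° of rotation; domain values are made up.
    class LapCountingSketch
    {
        static void Main()
        {
            double min = -400, max = 400;   // joint 6 rotation domain in degrees (assumption)
            // Raw solver output wrapped to ±180, representing continuous rotation in the same direction.
            double[] commanded = { 0, 120, -120, 0, 120, -120, 0 };

            var unwrapped = new List<double> { commanded[0] };
            for (int i = 1; i < commanded.Length; i++)
            {
                double delta = commanded[i] - commanded[i - 1];
                // take the shortest equivalent rotation for each step
                while (delta > 180) delta -= 360;
                while (delta < -180) delta += 360;
                unwrapped.Add(unwrapped[i - 1] + delta);
            }

            for (int i = 0; i < unwrapped.Count; i++)
            {
                bool ok = unwrapped[i] >= min && unwrapped[i] <= max;
                Console.WriteLine($"target {i}: {unwrapped[i],6:0} deg  {(ok ? "ok" : "OUT OF DOMAIN")}");
            }
        }
    }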

By the way, all those things are not exactly trivial to solve (some are with the new version of HAL, but not all of them), so depending on how far you need to go, I hope you don't have a deadline soon...

Comment by yj on March 10, 2013 at 9:34

First off, yes, I agree I don't really need HoopSnake, unless I wanted to share the definition with others who don't do C# and make the logic of making toolpaths somehow more accessible. Honestly, I am not a computer science major and learning C# will take some time for me, but drawing a spiral I can do.

The reason for this exercise is not necessarily with production in mind. It is for an interactive robotic installation with a special process where a gesture of a user is translated into coordinates, as you can see in the following links.

interactive robotic installation

http://www.youtube.com/watch?v=A-Baf55hSo4

outdoor robotic installation and the question of workspace

http://www.youtube.com/watch?v=P1832F7KDtU

MESO created a process where their custom platform sends coordinates from a visitor's touchscreen input, letting the tooltip of the robot know where to be for the next 40 milliseconds at a time. This is really not something robotics engineers do or like, because such a system cannot really be guaranteed. So MESO had to build their custom chassis, roughly approximating the workspace with some of their aesthetic sense. On the second one, instead of modelling such a complex space, which as you said may lead to a beautiful picture, they used a bubble because that was all that was needed.

Of course, it would be very useful to be able to just model some obstacles, such as the workbench and the material dimensions beyond the tools in case the robot is handling parts, and maybe try to use Galapagos to generate feasible toolpaths, given the configuration singularities. http://www1.adept.com/main/ke/data/procedures/singularity/singulari...

This would be useful especially if the next or prior task is affected by the particular configuration the robot is in when that task is done, or given the material dimensions of the object the robot is handling and the space around the robot, or if there are multiple robots in proximity.


Or it would be even more interesting if it were possible to model a depth sensor's sensory boundary on a custom tool, which then raises the possibility of designing new customized tools for ABB robots, for example one that can detect a change in its workspace and stop automatically when a foreign object enters the workspace. In that case, it would be useful to have that workspace modelled so that the motion sensor is able to separate the movements of the robotic arms from the movements, something like a z-buffer, that are not from the robot.

I have worked with Kinect quite a bit recently with vvvv and have some understanding of processing the z-buffer from its IR sensor, sampling and holding z-buffer changes, for example. With their new release of 64-bit support for Kinect nodes, the amount and speed of sensor technology is improving quite fast.

It's just an idea I would like to prototype: an intelligent extension of a robot that actually has the ability to influence the toolpath in a dynamic and interactive way in real time, of course with safety in mind. But first I need to model and simulate this idea, and that's why I am trying to understand the workspace a little better.

Thank you very much for your feedback, Mr Schwartz.

Comment by Thibault Schwartz on March 10, 2013 at 7:20

Dear Youngjae,

I don't understand why you need HoopSnake if you know C#. What I know for sure is that joints do not always rotate 360°, so you need to respect the rotation domains of every joint, and limit to between -180° and 180° the ones that have more than 360° of rotation. In any case it will just give you a raw workspace which does not take the orientation of the tool into account, so I am not sure the output will be really usable. Most end-effectors have constraints inherent to their geometry, coming from the material they process, wiring, from the workbench supporting the parts etc., so I am not sure this exercise will help you a lot; I am afraid it will just make a beautiful picture but will not be very useful for your application development.
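
To show what I mean by a raw workspace, here is a toy standalone C# sketch. The link lengths, joint domains and the simplified three-joint kinematics are all invented, not a real robot model: sweep the first joints inside their rotation domains, run a positional forward kinematics for the wrist centre, and you get a cloud of reachable points that ignores tool orientation entirely.

    using System;

    // Toy "raw workspace" sample: sweep the first three joints inside their rotation domains
    // and evaluate a positional FK for the wrist centre. All numbers below are made up.
    class WorkspaceSampleSketch
    {
        const double a1 = 0.175, a2 = 0.890, a3 = 0.880, d1 = 0.575;   // metres, invented link lengths

        // Wrist-centre position of a simple articulated arm (rotation about Z, then two pitch joints).
        static (double x, double y, double z) WristCentre(double q1, double q2, double q3)
        {
            double r = a1 + a2 * Math.Cos(q2) + a3 * Math.Cos(q2 + q3);
            double z = d1 + a2 * Math.Sin(q2) + a3 * Math.Sin(q2 + q3);
            return (r * Math.Cos(q1), r * Math.Sin(q1), z);
        }

        static void Main()
        {
            // Per-joint rotation domains in degrees (assumption; a real robot's data sheet gives these).
            var domains = new (double Min, double Max)[] { (-180, 180), (-90, 110), (-180, 60) };
            const double step = 30; // degrees

            double maxReach = 0;
            int samples = 0;
            for (double q1 = domains[0].Min; q1 <= domains[0].Max; q1 += step)
                for (double q2 = domains[1].Min; q2 <= domains[1].Max; q2 += step)
                    for (double q3 = domains[2].Min; q3 <= domains[2].Max; q3 += step)
                    {
                        var p = WristCentre(q1 * Math.PI / 180, q2 * Math.PI / 180, q3 * Math.PI / 180);
                        maxReach = Math.Max(maxReach, Math.Sqrt(p.x * p.x + p.y * p.y + p.z * p.z));
                        samples++;
                    }

            Console.WriteLine($"{samples} samples, max distance from base = {maxReach:0.00} m");
        }
    }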

Maybe I am wrong of course, but this is just the feeling I have of it.

Comment by yj on March 9, 2013 at 11:35

Hi Mr Schwartz, thank you for the quick reply and also for the definition.

1) FK solver + Galapagos, or

2) IK solver + some mathematical functions with either Grasshopper native components or C# + HoopSnake seem to be the options I can see.

With HoopSnake, I would like to set each sub-axis to rotate 360 degrees while the axis directly one level up in the hierarchy moves 1 degree. Repeating this step all the way down to the axis at the base will give me some idea of the physical space that is the robot's workspace. But maybe the term workspace more specifically refers to the points the end of the tool can reach, so I hope what I have written is not misleading.

The Galapagos definition you sent me inspired me to look further into this topic. The primary goal is to model the workspace and singularity as if they were an object. I found PhD work on workspace and singularity here: http://www.europeana.eu/portal/record/2020801/9BB75131B39EE8536A258...

http://www.europeana.eu/portal/record/2020801/FDB9E3EBEA6DF5ADB318D...

Anyway, it's Saturday, so maybe Monday I will have something! Thanks again.

Comment by Thibault Schwartz on March 9, 2013 at 6:02

Dear Youngjae,

The FK and IK solvers detect singularities, so you can play with them to get what you want.

I attached a little example file showing a robot with a custom tool, and a simple recording of multiple positions in space that do not trigger collisions, generated using Galapagos and the FK solver (so, a raw approximation of the workspace).

Please note that workspace computing is covered by a lot of papers (math stuff), so if you know how to read those, you will get much better results by directly implementing some mathematical functions.

Workspace.gh

Comment by yj on March 8, 2013 at 8:46

Dear Mr Schwartz, is it possible to rotate one axis of the robot at a time with a custom tool tip? I would like to study the singularity of the robotic arm orientations with other, newer Grasshopper plug-ins in order to model the four-dimensional boundary of a given robotic arm.

 
