picture:
... and on a PC without anything attached to the serial port. When you open the port, start the read component and its timer, do you then get a stream of <empty> values in the log output? (hmmm... I suppose that's only reasonable - but still, you are also seeing this?)
I suppose that, because of the mutually exclusive behavior of both the spider and grasshopper (i.e. only one at a time can access the COM port), we can deduce that we are listening on the correct port.
Am I listening on the correct pin (if such a notion makes sense at all)? If I look back at the spider software, I see that 9 channels are listed and that only the measured value on channel 0 changes when I press the load cell. Channels 1, 2, and 3 report OVERFLOW; 4, 5, 6, and 7 are pretty much constant at 0.000 to 0.005 V; and channel 8 says FFFF. I don't know how such channels work, so I can't tell whether they correspond to the 9 pins on the D-sub 9 connector.
As for your BTW question: no, I don't need to record all of the sensor values. I suppose that the Out value on the Read component will always reflect the most current value, and that's all I need to get on with life. In the end, the idea is that we have 4 load cells in the 4 corners of a plate onto which a vertical pipe is fixed. Loads are then put on the top end of the pipe, and we'll have to visualize both the direction and magnitude of the bending moment that is calculated from the compression/tension readings from the load cells... We've done this on a scaled model and streamed load cell information into MATLAB. Now we'll have to use a different datalogger, and I was hoping to be able to do the post-processing in Rhino.
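In code terms, the post-processing I have in mind would be something like the sketch below: assemble the bending moment at the plate centre directly from the four cell readings. All numbers and the sign convention are assumptions (square plate, cells at the corners, compression read as positive) - flip signs and dimensions to match the actual rig:

```python
import math

# Assumed geometry: square plate, load cells at the four corners,
# pipe fixed at the centre. half = half the plate width in metres.
half = 0.25
corners = [(-half, -half), (half, -half), (half, half), (-half, half)]
forces = [120.0, 80.0, 60.0, 100.0]  # example cell readings in N (compression +)

# Moments of the cell reactions about the plate centre:
# a cell at lever arm y contributes to bending about the x-axis,
# a cell at lever arm x contributes (opposite sign) about the y-axis.
Mx = sum(f * y for (x, y), f in zip(corners, forces))   # N*m
My = sum(-f * x for (x, y), f in zip(corners, forces))  # N*m

magnitude = math.hypot(Mx, My)                 # size of the bending moment
direction = math.degrees(math.atan2(My, Mx))   # orientation of the bending axis

print(Mx, My, magnitude)
```

With the example readings above, the result vector could then be drawn in Rhino as an arrow at the plate centre for visualization.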
wim…
aph relaxation in 3D and more). There is much more already in our GitHub repos, and more to be added. To get an idea of our future direction, check out this lecture. To get a better understanding of graphs and graph theory, watch this lecture, and this lecture on a gamified spatial configuration process. Stay tuned for more, and do not hesitate to post Python questions in the meantime.
ps. If you are having installation problems, please check the remedy suggested below:
Comment by Iman Sheikhansari on August 26, 2019 at 8:33am
Hi,
If you are encountering a problem with Rhino 6 versions, don't worry. Follow these steps:
1. Download SYNTACTIC from https://sites.google.com/site/pirouznourian/syntactic-design
2. Install it, go to the installation folder, and drag & drop SYNTACTIC (the green one) onto your Grasshopper canvas.
3. Close Rhino and reopen it.
4. Type GrasshopperDeveloperSettings.
5. Tick the "Memory load *.GHA assemblies using COFF byte arrays" option.
6. Run Grasshopper and enjoy the plugin.
…
ariations, but each seems to lack the sophistication to generate a ‘zip’ that retains its general shape over the whole curve.
Basically I’m trying to understand the process behind this: http://www.schindlersalmeron.com/index.php?option=com_content&task=view&id=27&Itemid=29
Here is an image of the latest definition.
1. I draw a curve in Rhino, and then define it in grasshopper. I also define the point as the beginning of the curve.
2. I offset the curve to a specified depth, based on structural member
3. I generate a line from the point at a tangent to the curve, then rotate it a
defined angle.
4. I find the intersection between the rotated line and the offset curve. Then generate a tangential line from this new point
5. Line is rotated at the same angle as before.
6. Process repeated.
The idea is to then generate a circle of defined diameter at each of the intersection points, then find the intersection of the circles with the curves, which are then joined up with straight lines to create the ‘zip’. This would mean a lot of copy-pasting and list management that I’m not really capable of with my limited grasshopper experience.
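To make the process concrete, here is a rough standalone sketch of the iteration described in steps 1-6, using a circle as a stand-in for the base curve and its offset, and plain Python geometry helpers instead of Grasshopper or Rhino calls (all of this is illustrative, not the actual definition):

```python
import math

# Stand-in for the base curve: a circle of radius R_OUTER centred at the
# origin; its offset is the concentric circle of radius R_INNER.
R_OUTER, R_INNER = 10.0, 9.0
TOOTH_ANGLE = math.radians(60)  # rotation applied to the tangent

def tangent_at(p):
    """Unit CCW tangent of an origin-centred circle at point p."""
    r = math.hypot(p[0], p[1])
    return (-p[1] / r, p[0] / r)

def rotate(v, ang):
    c, s = math.cos(ang), math.sin(ang)
    return (c * v[0] - s * v[1], s * v[0] + c * v[1])

def ray_circle_hit(p, d, radius):
    """Nearest intersection of the ray p + t*d (t > 0) with a circle."""
    a = d[0] ** 2 + d[1] ** 2
    b = 2.0 * (p[0] * d[0] + p[1] * d[1])
    c = p[0] ** 2 + p[1] ** 2 - radius ** 2
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None
    roots = [(-b - math.sqrt(disc)) / (2 * a), (-b + math.sqrt(disc)) / (2 * a)]
    hits = [t for t in roots if t > 1e-9]
    if not hits:
        return None
    t = min(hits)
    return (p[0] + t * d[0], p[1] + t * d[1])

# Steps 3-6: take the tangent at the current point, rotate it, intersect
# with the *other* curve, repeat - alternating base curve and offset.
pt = (R_OUTER, 0.0)          # step 1: start point on the base curve
zigzag = [pt]
on_outer = True
for _ in range(30):
    ang = TOOTH_ANGLE if on_outer else -TOOTH_ANGLE
    nxt = ray_circle_hit(pt, rotate(tangent_at(pt), ang),
                         R_INNER if on_outer else R_OUTER)
    if nxt is None:
        break                # rotated tangent missed the other curve
    zigzag.append(nxt)
    pt, on_outer = nxt, not on_outer
```

Because the tooth angle is measured against the local tangent at each step, the 'zip' keeps its general shape as the curve turns, which is exactly what the shifted-list approach loses.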
I had tried generating points at intervals along the curve and then eventually generating lines from one line to another with a shifted list to form the tooth angle, but it wouldn't retain its shape over the entirety of the curve.
Does anyone have any advice for how to tighten up this definition? I imagine that I will need to delve into vb.net scripting to address the recursive nature of the process.
I fear that I’m going about this in entirely the wrong way...
Of course the next step is to flatten out the curve for CNC manufacture.
Any help would be greatly appreciated! The potential for using grasshopper in design is amazing, and I would love to gain a deeper understanding of it!…
ort and export from the images below and also from the HELP file of DB in the attachments (Page 71: Importing Geometric Data; Page 78-80: Import 3-D CAD Data). In their HELP file, they mention "import geometric data".
However, regarding the input of schedules, loads, constructions, etc., DB normally uses "Components" and "Templates" (Page 29: Templates And Components; Page 591: Templates; Page 533: Components). "Templates" are databases of typical generic data, including Activity, Construction, Glazing, Facade, HVAC, and Location templates. "Components" are databases of individual data items (e.g. a construction type, material, or window pane).
Both "Components" and "Templates" can be imported and exported using the "Import / Export library data" command (.ddf format - DB Database File; Page 734: Import Components/Templates, Export Components/Templates). DB also allows us to build up our own libraries of templates and components (Page 731: Library Management; Page 733: Template Library Management).
In order to import both the geometric information and the other information related to schedules, loads, constructions, etc. from GH to DB, we considered the following two approaches:
1. GH(HB+GB) --> gbXML (both geometric and "Component"/"Template" information) --> DB
This is the approach we prefer. We did see information related to schedules, loads, and constructions encoded in the gbXML file generated by GB, but we still do not know why DB did not take this information in (I also mentioned this in Q6 within the gh file). We assume this might be because the gbXML file we create encodes the schedules based on a different template/schema than the one DB expects. We have also posted this question to the DB forum for help.
(http://www.designbuilder.co.uk/component/option,com_forum/Itemid,25/page,viewtopic/p,13755/#13755)
2. GH(HB+GB) --> gbXML (geometric information only) + .ddf ("Component" and "Template" information only) --> DB
If the first approach doesn't work and DB only takes geometric information from the gbXML, then we might try the other way: generating the .ddf files from GH(HB+GB) to pass the schedule, load, and construction information to DB.
I was wondering whether it would be feasible for HB and GB to have this function, and what you would suggest to achieve this?
In addition, we noticed that DB can export XML files (not gbXML), so we are trying to figure out whether DB also accepts/reads XML files. If so, we might be able to convert the gbXML (with both geometric and schedule information) to XML. What do you think of that?
Thank you again for all your help!
Best,
Ding
DB import
DB export
Template libraries
Component libraries
…
s to approach parametric design.
The course is aimed at architects, designers, and design engineers who want to incorporate parametric modeling techniques into their project toolset.
The course runs for 20 hours, spread over 6 sessions on Mondays and Wednesdays from 5pm to 8:20pm, at the Calle Nueve cultural space (Calle 9 # 43b-75, below Parque del Poblado; https://www.facebook.com/calle.nueve). The course starts on Monday, August 22, 2011. Enrollment is capped at 15 people per course to guarantee teaching quality.
The course will be taught by architects Ana Maria Bustamante and David Vanegas of the architecture office interior137 (www.interior137.blogspot.com), who have more than two years of experience with GRASSHOPPER and an established track record as lecturers at the Faculty of Architecture of the U.P.B.
To participate in the workshop, students must bring a laptop for their personal use throughout the course, have Rhino version 4.0 with the SR9 update installed, and have a minimum knowledge of modeling and of this software's interface.
Contents:
Session 1: * Introduction to parametric modeling and algorithmic design.
* Grasshopper: data + actions. Interface.
Session 2: * Fixed data, variable data: parameters.
* Points, parametric curves.
* Transformations: Move, Rotate.
Session 3: * Multiple data (lists): Series. Ranges.
* Functions of 1 and 2 variables.
Session 4: * Managing data in lists: selecting items, sorting, shuffling, and removing them.
Session 5: * Attractors.
Session 6: * Surfaces: surface creation, paneling.
Information and registration:
To enroll in the course, reserve your place by paying the full course fee no later than Wednesday, August 17. This amount will be fully refunded only if the course is cancelled.
For more information, contact interior137@gmail.com with the subject line: CURSO GH…
ugh information (whether coming from environmental analysis or any kind of database), and extracting and managing information for construction processes, all require an understanding of data structures in order to build seamless design-to-construction pipelines. Through visual scripting in Grasshopper (the generative modeling plug-in for Rhinoceros), participants will learn how to build and develop parametric data structures (from simple lists to complex data trees) and data-driven geometry and envelopes, and how to extract relevant information from such models for construction processes. Participants will also develop a personal envelope project and its full design-to-construction pipeline.
Topics
Theory:
- Lecture: "Data Obsession" – the computational designer as a new professional profile, and the role of information and complexity in contemporary architecture
Technique:
- Software interface
- Components
- Lists & Data Trees: management, manipulation, visualization
- Geometry generation from data streams
- Base exercises (Box morph, Image sampler, Floor sections, Attractor field, Multisection Pipe, Paneling)
- Advanced exercise: Data-reactive component – data-reactive tessellation on a NURBS surface, with data coming from environmental analysis or a spreadsheet table
- Advanced exercise: Data extraction from the previous tessellation, visualization, and storage in spreadsheets
- Advanced exercise: Geometry optimization for construction
Software & skills:
Basic modeling skills in Rhino are required. Participants should bring their own laptop with pre-installed software (download links will be given after subscription).
Tutors:
Alessio Erioli + Andrea Graziano – Co-de-iT (GH & design tutors).
Venue:
Polycollege Wien
Johannagasse 2
1050 Wien
http://www.vhs.at/johannagasse.html
Calendar & Timetable:
The workshop will run on the same timetable throughout all 4 days:
9:00-13:00 lesson + tutoring
14:00-17:00 lesson + tutoring
Subscription fees:
For participants who register before 30/08/2012 we offer EARLY BIRD fees:
E.B. – educational*: € 320 + VAT
E.B. – professional: € 390 + VAT
After 30/08/2012 the STANDARD fees apply:
STANDARD – educational*: € 390 + VAT
STANDARD – professional: € 490 + VAT
* students, teachers, researchers & PhD candidates (proof of status required).
The registration deadline is 06/09/2012. The workshop has a maximum of 30 places available and will be activated with a minimum of 15 participants.
Application:
To register, please fill in this FORM and send it via e-mail to 3ddreaming@gmail.com or ck@kkkc.at
Organized by:
This workshop is organized by Co-de-iT in collaboration with:
3d-dreaming.com – Architecture from a digital point of view
KKKC – Mediaware trading GmbH…
hat since we create a list of materials and assign them to surfaces and volumes, the next step could be to have a Life Cycle Analysis and a financial assessment produced.
The most common way to present an LCA in a form that is widely used and easily communicated is as an Environmental Product Declaration (EPD) following ISO 14025:2006. Like every form of LCA, EPDs raise a bunch of questions regarding their boundaries and the accuracy of the results, especially if we include the factor of location. In comparison with other LCA practices, though, EPDs have to follow Product Category Rules (defining the boundaries of the study) that can be reviewed by external parties if the EPD is to go public. Apart from that, EPD results reflect each stage of the life cycle of a product, including potential benefits from reuse or recycling. Finally, if you have a system - for example a building - you can add up the EPDs of the different subcomponents forming the building and get a final EPD for the building itself - the point where I think HB's functionality is fully aligned.
The financial assessment can easily be completed if one has the prices of the materials used. Finally, the environmental indicators of the EPDs (LCI, LCIA) can be translated into shadow costs (shadow costs for environmental indicators here) and optionally added to the final financial assessment.
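In code terms, the building-level roll-up is essentially a quantity-weighted sum over the bill of materials. A toy sketch (the EPD values, prices, and the shadow-cost factor below are made up for illustration only):

```python
# Hypothetical EPD records per material: GWP (kg CO2-eq per kg)
# and price per kg. These numbers are illustrative, not real EPD data.
epd_db = {
    'steel_profile': {'gwp_per_kg': 1.9,  'price_per_kg': 0.9},
    'concrete':      {'gwp_per_kg': 0.12, 'price_per_kg': 0.08},
}
bill_of_materials = {'steel_profile': 12000.0, 'concrete': 250000.0}  # kg
shadow_cost_per_kg_co2 = 0.05  # EUR per kg CO2-eq, assumed value

# Building EPD indicator = sum of sub-component indicators, quantity-weighted
building_gwp = sum(epd_db[m]['gwp_per_kg'] * q
                   for m, q in bill_of_materials.items())
# Plain material cost
material_cost = sum(epd_db[m]['price_per_kg'] * q
                    for m, q in bill_of_materials.items())
# Optional: fold the environmental indicator into the financial assessment
total_cost = material_cost + building_gwp * shadow_cost_per_kg_co2
```

The same structure extends to any number of indicators (LCI, LCIA) by summing each one separately before applying its shadow-cost factor.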
I developed a similar plug-in (in C#) for Grasshopper for my master's thesis last year. The project focused on the comparison between conventional construction and construction implementing Design for Deconstruction practices in steel buildings. The idea was to compare the two cases based on their environmental and financial performance. Along the way I also included options for transportation of the material, and for shadow cost, embodied energy and carbon assessment, and more. The final outcome can be visualised in Rhino's viewports and exported to Excel sheets. The plug-in is connected to a local database with EPD data for steel profiles. The same scheme can be followed for any type of material if we have the right database to connect to!
Please have a look at the report here if you are interested, and let me know if you have any questions!
Please note that the report includes 3+ chapters dedicated to design for deconstruction practices etc. that are irrelevant to the topic, but maybe interesting to read :)
Also, if someone is interested in the report, I can always send it to you.
(I will upload a video run-through of the plug-in later this week.)
I would be very interested to have these capabilities in LB and HB, and happy to help realise them!
Thanks
Tasos
…
ne diverse digital design methodologies and the use of different tools such as Autodesk Maya, Rhinoceros and Grasshopper.
Building up technical skills will provide the attendees with a solid platform from which to start rethinking and exploring innovative architectural ideas in collaboration with the team and the tutors.
URBAN FIELDS
Phase I
In the first part of the workshop attendees will be looking at field conditions and how to generate and design such fields that can help structure a possible urban condition in Florence.
We will be exploring dynamic systems, geometric systems and network theories to generate and design an abstract field condition that extends the urban experience of the city onto the vertical dimensions of towers. Simple operations that span variations from an initial state will give rise to high levels of complexity.
The goal of this exercise is to create a rich and diversified intelligible urban space that can later be subjected to local interventions, zooming in to locally enhance each design.
AGENT - BODIES POLYMORPHISM
Phase II
The second part of the workshop will build upon the first phase; participants will select one archetype (a high-rise tower) as a study model for further development.
Besides engaging with multi agent algorithms design strategies, attendees will address strategic utilisation of structurally and environmentally generated morphologies to design coherent and highly differentiated tower exo-skeletons.
Tutors will introduce agent-bodies polymorphism in order to explore the generation of structurally aware and capable geometries through agent-based formation of non-linear hierarchies and emergent patterns. These agent-bodies will operate in a complex spatial manner to form structure, partitions or enclosure, and will operate across scales, creating a poly-scalar level of detail.
Attendees will speculate on how autonomous systems can create new structures and intelligent distributions of structural elements, on new collaborative strategies of construction, and on the performativity they will evoke (performance, effects, responsiveness, interaction).
Fees
Early registration (before 1st June)
Students 390€ - Professionals 440€
Late registration (after 1st June)
Students 490€ - Professionals 540€
More info and Applications
https://www.ax-om.com/edu/polymorphism/
…
both my plotter/cutter and wide-format printer. I had been running the plotter from my main work laptop - a Win10 machine - via the plotter's USB port. As it turns out, you can't get Win XP drivers for this USB connection, so I needed another solution.
I tried to use the plotter's DB25 serial port with an old DB9-to-DB25 modem cable I had in my collection - no luck, the plotter wouldn't talk. A bit more research revealed that these plotters need a 'null modem' crossover cable to operate. I found a picture of the correct wiring online and made up my own with some cable and connectors from the local electronics hobby shop.
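For reference, here is my reading of the standard DB9 null-modem crossover, expressed as a pin map from one connector to the other (check against the plotter manual and the wiring picture before soldering - this is the common textbook scheme, not necessarily the exact one this plotter needs):

```python
# DB9 null-modem wiring: side A pin -> side B pin(s).
# Standard scheme: TX/RX swapped, RTS/CTS swapped,
# DTR looped to DSR and DCD on the other side, ground straight through.
null_modem = {
    3: [2],       # TX  -> RX
    2: [3],       # RX  <- TX
    7: [8],       # RTS -> CTS
    8: [7],       # CTS <- RTS
    4: [6, 1],    # DTR -> DSR + DCD
    6: [4],       # DSR <- DTR
    1: [4],       # DCD <- DTR
    5: [5],       # signal ground straight through
}

# Sanity check: data and handshake lines must be crossed, ground straight.
assert null_modem[3] == [2] and null_modem[2] == [3]
assert null_modem[5] == [5]
```

The same crossover logic applies on the DB25 side, just with different pin numbers.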
With this hooked up and using Hyperterminal I was able to fire some codes to the plotter directly and get a response back - winning!
At this point I got my original code working with the 'net use' redirect from LPT1 to COM1.
HOWEVER - being that the plotter was now on a COM port there are a few more interesting things you can do with it - one is being able to read the paper size/cut area from the printer.
So what I needed to do was find a way to send and receive data to/from the plotter using the serial port.
With a bit of research into .NET's serial port interface, and using a bunch of small pieces of test code, I have managed to completely re-jig this driver.
Upgrades include:
- Direct Serial Port comms using Null Modem cable (a USB to serial adaptor + null modem should also work)
- Plot area read from the plotter - a rectangle the size of the plot area is placed on a separate layer and coloured red
- Testing to see if selected plotting curves are both closed and inside of the cutting area - with errors shown and exiting if they are not right.
- After-plot 'parking' of the plot head at the end of the cut items, plus an adjustable offset (currently requires manually resetting the origin on the plotter before the next cut)
The great thing is that it is now running 100% within Rhino Python - no DOS command-line calls means no flashing up of the CMD window. There are also no temp files needed on the HDD and no limit on the number of curves that can be plotted - tested with 200 or so with no issues.
Overall very happy with whole project - have learnt a LOT about Python and .NET interfacing AND ended up with a very handy/useful tool.
Cheers
DK
# This code is a WIP
# It plots directly to a DGI Plotter via the serial port

import System.IO.Ports as Ports
import rhinoscriptsyntax as rs
import time

# Some setup values
com_port = 'COM1'      # change to match plotter port
baud_rate = 9600       # change to match plotter setting
plotter_step = .025    # mm
finsh_offset = 10      # mm

# Delete old cutting area and cut objects
if rs.IsLayer('Cutting Area'): rs.PurgeLayer('Cutting Area')
if rs.IsLayer('Cut Objects'): rs.PurgeLayer('Cut Objects')

# Setup serial port
Myport = Ports.SerialPort(com_port)
Port_Write = Ports.SerialPort.Write
Myport.BaudRate = baud_rate
Myport.ReadTimeout = 5000  # 5 secs
Myport.Close()
Myport.Open()

# Setup plotter
Port_Write(Myport, 'PU;PA0,0;IN;\n')
Port_Write(Myport, 'SP1;\n')
Port_Write(Myport, 'PA;\n')
time.sleep(2)

# Read the paper size from the plotter
Port_Write(Myport, 'OH;')  # HPGL read limits code
time.sleep(2)

return1 = ''
papersize = ''
count = 0
chars_in_buffer = Ports.SerialPort.BytesToRead.GetValue(Myport)

if chars_in_buffer == 0:
    print 'Plotter not ready'
    Myport.Close()
    exit()

while count < chars_in_buffer:
    return1 = Myport.ReadChar()
    papersize = papersize + chr(return1)
    count = count + 1

papersize = papersize.split(",")
rect1 = float(papersize[2]) * plotter_step
rect2 = float(papersize[3]) * plotter_step

print 'Cutting area = ' + str(rect1) + 'x' + str(rect2)

# Place the cutting area curve on its own layer, make it red and lock it
plane = rs.WorldXYPlane()
cutting_area = rs.AddRectangle(plane, rect1, rect2)
rs.AddLayer(name='Cutting Area', color=(255,0,0), visible=True, locked=True, parent=None)
rs.ObjectLayer(cutting_area, 'Cutting Area')

# Get plotting objects
allCurves = rs.GetObjects("Select curves to plot", rs.filter.curve)

# Test to see if these are closed curves - exit if not
for curve in allCurves:
    if not rs.IsCurveClosed(curve):
        print "One or more of these curves are not closed"
        Myport.Close()
        exit()

# Test to see if these are inside the cutting area - exit if not
for curve in allCurves:
    test_inside = rs.PlanarClosedCurveContainment(curve, cutting_area)
    if test_inside == 0 or test_inside == 1:
        print "One or more of these curves are outside of the cut area"
        Myport.Close()
        exit()

# All ok - convert to points and send data to the plotter
rs.AddLayer(name='Cut Objects', color=(0,255,0), visible=False, locked=True, parent=None)

for curve in allCurves:
    Port_Write(Myport, 'PU;PA;SP1;\n')
    polyline = rs.ConvertCurveToPolyline(curve, angle_tolerance=5.0, tolerance=0.025,
                                         delete_input=False, min_edge_length=0, max_edge_length=0)
    points = rs.CurveEditPoints(polyline)
    rs.ObjectLayer(polyline, 'Cut Objects')

    # PU (pen up) to the first point
    x = points[0][0]
    y = points[0][1]
    Port_Write(Myport, 'PU' + str(int(x / plotter_step)) + ',' + str(int(y / plotter_step)) + ';\n')
    # PD (pen down) to every subsequent point
    i = 1
    while i < len(points):
        x = points[i][0]
        y = points[i][1]
        Port_Write(Myport, 'PD' + str(int(x / plotter_step)) + ',' + str(int(y / plotter_step)) + ';\n')
        i += 1

Port_Write(Myport, 'PU;\n')

# Find the far end of the cut
box = rs.BoundingBox(allCurves)
far_end = str(box[1])
far_end = far_end.split(",")
far_end = float(far_end[0]) / plotter_step
far_end = str(int(far_end) + finsh_offset)
print far_end

# Return the plotter home and close the port
Port_Write(Myport, 'PU;PA' + far_end + ',0;IN;\n')
Port_Write(Myport, 'SP1;\n')
Port_Write(Myport, 'PA;\n')
Myport.Close()
time.sleep(10)…
it seems that was it. Now everything is working fine!
Glad that it worked! But I am still a bit worried. Gismo components only modify the gdal-data/osmconf.ini file and no other MapWinGIS file, so your MapWinGIS installation files should not be compromised. The fact that you did not get the "COM CLSID" error message when running the "Gismo Gismo" component suggests that MapWinGIS has been properly installed. So I wonder if the cause of the permanent "invalid shapes" warning again has something to do with your system not allowing MapWinGIS to properly edit the osmconf.ini. Maybe this problem will appear again and again, and reinstalling MapWinGIS every time can be somewhat bothersome.
- About the terrain generation: is it possible to have the texture from Google or another provider mapped onto the terrain surface from the Gismo component (same as using the Ladybug terrain generator, in fact)? I tried to use the image extracted by the Ladybug component and then apply it to the Gismo terrain, but the texture is rotated by 90°.
The issue with the rotation can be solved by swapping/reversing the U,V directions of the terrain surface. A slightly more important issue is that the terrain surface generated with the Gismo "Terrain Generator" component might have a somewhat smaller radius than what the radius_ input requires. This stems from the fact that the terrain data first needs to be downloaded in a geographic coordinate system and then projected. Some projection issues may occur at the very edges of the projected terrain, so I had to slightly cut out the very edges of the terrain, which results in the actual terrain diameters being slightly shorter in both directions. This means that if you apply the same satellite image from the Ladybug "Terrain Generator" component to the Gismo "Terrain Generator" component, the results may not be the same.
I attached below a Python component which tries to solve this issue by extending the edges of the Gismo "Terrain Generator" terrain, and then cutting them with a cuboid of the exact dimensions of the radius_ input. Have in mind that this extension of the original terrain at its edges is not a correct representation of the actual terrain in that location, but rather just an extension of the isoparametric curves of the terrain surface. So basically: some 0 to 10% (0 to 10 percent of the width and length) of the terrain around all four edges is not the actual terrain for that location, but rather just its extension.
The Python component is located at the very right of the definition attached below.
Also, if you would like to use the satellite images from Ladybug "Terrain Generator" component along with "OSM shapes", sometimes you may find slight differences in position of the shapes. This is due to openstreetmap data not being based on Google Maps (that's what Ladybug "Terrain Generator" component is using), but rather on Bing, MapQuest and a few others.
- About the requiredKeys_ input of OSM shapes: I understand what you mean and your advice, but in most cases where I use it, the component works fine even without that input. Isn't it better to extract all tags, values and keys of the selected area instead of searching for specific ones, since I try to find all the data related to what I want afterwards? It also lets me check what keys are present in the area.
Indeed, you are correct. I thought you were trying only to create a terrain and 3D buildings, and maybe find some school or similar 3D building, for these two locations. The recommendation I mentioned previously is due to shapefiles having a limit (2044) on how many keys they can contain. This requires further testing on some big-city locations with maybe larger radii, which I haven't performed due to my poor PC configuration. But in theory, I imagine it may happen that a downloaded .osm file has more than 2044 keys. In that case the shapefile will only record 2044 of them and disregard the others. That was my point. But again, 2044 is a lot of keys, and I haven't checked this much in practice. For example, when I set the radius_ to 1000 meters and use your "3 Rue de Bretonvilliers Paris" location, I get around 350-something keys, which is way below 2044.
Another reason to use the requiredKeys_ input is to make the Gismo OSM components run quicker: for example, the above-mentioned 350-something keys will result in 350 values for each branch of the "OSM shapes" component's "values" output. This means that if you have 10,000 shapes, the "OSM shapes" component will have 10,000 branches with 350 items on each branch (values). This can make all Gismo OSM components very heavy and significantly lengthen the calculation process. With the requiredKeys_ input you may end up with only a couple of tens of items per branch.
Sorry for the long reply.…
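A toy illustration of the size difference (shape and key counts are made up, following the numbers in the example above):

```python
# Hypothetical extract: ~350 distinct keys found in the .osm file,
# 10 000 downloaded shapes.
all_keys = ['building', 'height', 'amenity'] + ['misc_key_%d' % i for i in range(347)]
n_shapes = 10000

# Without requiredKeys_: every branch of the "values" output carries
# one item per key, whether the shape defines that key or not.
items_without_filter = n_shapes * len(all_keys)

# With requiredKeys_ = ['building', 'height']: two items per branch.
required_keys = ['building', 'height']
items_with_filter = n_shapes * len(required_keys)

print(items_without_filter, items_with_filter)  # 3500000 vs 20000
```

That is more than a hundredfold reduction in the number of data-tree items the downstream components have to carry.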
Added by djordje to Gismo at 8:57am on June 11, 2017