design, construction and understanding of our environment.
BIM is putting at the disposal of designers and managers genuine databases that can be generated, connected and edited parametrically, providing a solid layer of reality for the generative design and computation exercises studied at Algomad, the seminar that seeks to popularise programming and parametrisation in the design and experience of our built environment.
After a break in 2015, Algomad returns with the aim of demonstrating how a computational view of BIM is an opportunity to improve the way engineers, architects, contractors and operators of buildings and infrastructure work, building a bridge between the most advanced digital design techniques and the reality of construction.
Algomad 2016 will take place in central Madrid, at IE School of Architecture and Design, IE University, on 3, 4 and 5 November 2016, and will comprise four workshops as well as talks by leading experts.
Structure of Algomad 2016
Algomad 2016 is organised around three main thematic areas:
BIM, as the all-encompassing methodology specific to the construction sector.
Computation, covering the application of programming and parametrisation to the design of buildings and infrastructure.
Reality, as the working framework, always seeking to solve real problems through the two areas above.
Target audience
Architects, technical architects, engineers and, more generally, academics, final-year students and professionals from the real estate and construction sectors who share an interest in the digitalisation of our industry. A minimum level of proficiency with BIM and parametric tools is expected. Algomad will provide additional free training in the basic tools used in the workshops to ensure participants can keep up.…
ther math and logic. I can usually conceptualise what I want to do and cobble some semi-working thing together, but I don't know which components to use or how to patch them. So I'm super happy that someone who knows what he's doing finds this interesting.
And I'm glad you mention the fanned frets again: there is one input parameter still missing for the multiscale frets to be fully parametric, namely the angle of the nut, or which fret should be straight. Which is more comfortable depends a bit on personal preference and playing posture, so being able to adjust this easily would be cool. Again, I have no idea how the maths for that works, or whether you can just rotate each fret by the same amount around its middle point. Either a fret number (for the straight fret) or a simple slider from bridge to nut should do as the input setting.
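For what it's worth, the maths is simpler than it looks: compute ordinary 12-TET fret distances on the bass and treble scales separately, then slide one side lengthwise until the chosen fret lines up on both sides, which makes exactly that fret perpendicular. A minimal Python sketch (the names and the 27"/25.5" example values are illustrative):

```python
def fret_positions(scale_length, n_frets):
    """Distance from the nut to each fret along one string (12-TET)."""
    return [scale_length * (1 - 2 ** (-n / 12)) for n in range(n_frets + 1)]

def multiscale_fret_lines(scale_bass, scale_treble, n_frets, straight_fret):
    """Endpoints of each fret: longitudinal coordinate on the bass and treble side.

    The bass side is slid lengthwise so the chosen fret gets the same
    coordinate on both sides, i.e. that fret ends up perpendicular.
    """
    bass = fret_positions(scale_bass, n_frets)
    treble = fret_positions(scale_treble, n_frets)
    shift = treble[straight_fret] - bass[straight_fret]
    return [(b + shift, t) for b, t in zip(bass, treble)]

# Example: 27" bass scale, 25.5" treble scale, 24 frets, 7th fret straight
for n, (b, t) in enumerate(multiscale_fret_lines(27.0, 25.5, 24, 7)):
    print(f"fret {n:2d}: bass {b:7.3f}  treble {t:7.3f}")
```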
Here are the two extremes and the middle ground:
I've been thinking today, while analysing your patches and cleaning up my mess, about what exactly the monster should do.
Here are the input parameters needed; I think this is the complete list:
scale length low E string
scale length high E string
fret angle/straight fret
string width at nut
string width at bridge
number of frets
fretboard overhang at nut (distance from string to fretboard bounds)
fretboard overhang at last fret
string gauges
string tensions
fretboard radius at nut (for a compound-radius fretboard, the radius at the bridge is calculated with the StewMac formula)
fretwire crown width
fretwire crown height
action height at nut (distance between bottom of string and fretwire crown top)
action height at last fret
pickup 1 neck position
pickup 2 middle position
pickup 3 bridge position
nut width
The pickup positions should be used to draw circles for the magnet poles on each string, so they are perfectly aligned and can be used for the pickup flatwork construction. Ideally they would also need a rotation control aligning the centre line of the pickup somewhere between the last-fret angle and the bridge angle. Personally I do this visually depending on the design I'm looking for; some people have elaborate theories on pickup positioning, but personally I don't believe in them.
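Once the string lines exist, the pole centres are just points a fixed distance down each string; a tiny sketch of that interpolation (hypothetical names, 2D for simplicity):

```python
import math

def pole_centre(nut_xy, bridge_xy, dist_from_nut):
    """Point on a string at a given distance from the nut (linear interpolation)."""
    (x0, y0), (x1, y1) = nut_xy, bridge_xy
    t = dist_from_nut / math.hypot(x1 - x0, y1 - y0)
    return (x0 + t * (x1 - x0), y0 + t * (y1 - y0))

# One pole circle centre per string, for a pickup 20" from the nut:
strings = [((0.0, 0.0), (26.0, -0.3)), ((0.0, 1.7), (26.0, 2.0))]  # (nut, bridge)
centres = [pole_centre(nut, bridge, 20.0) for nut, bridge in strings]
```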
That should result in everything needed to quickly generate all the necessary construction curves or geometry for the nut, fingerboard, frets and pickups. This is the core of what makes a guitar work: the more precise this dynamic system is, the better the guitar plays and sounds.
I posted another thread trying to understand how I could use datasets from spreadsheets, databases or CSV files to organise the input parameters. What would make sense for the strings, for example, is to hook into a spreadsheet with the different string sets; I attached one for the D'Addario NYXL string line, which basically covers all the combos that make sense.
The string tension is an interesting one, and implementing it would surely be overkill, albeit super interesting to try. It should be possible to extrapolate from the scale length of each string what the tension for a given string gauge would be, so that you could say 'I want a fully balanced set' or 'heavy top, light bottom' and it would calculate which D'Addario SKU best matches the required tension. All the strings listed in the spreadsheet are available to buy as single strings.
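D'Addario publishes a simple tension formula, T = UW × (2LF)² / 386.4 (tension in lbs, unit weight UW in lb/inch, scale length L in inches, frequency F in Hz), so picking the closest SKU is just a lookup. A sketch, assuming a hypothetical CSV with sku, gauge and unit_weight columns:

```python
import csv

def string_tension(unit_weight, scale_length, freq):
    """D'Addario's published formula: T = UW * (2 * L * F)^2 / 386.4."""
    return unit_weight * (2 * scale_length * freq) ** 2 / 386.4

def best_match(csv_path, scale_length, freq, target_tension):
    """Pick the row whose computed tension is closest to the target."""
    with open(csv_path, newline="") as f:
        rows = list(csv.DictReader(f))
    return min(rows, key=lambda row: abs(
        string_tension(float(row["unit_weight"]), scale_length, freq) - target_tension))

# e.g. best_match("nyxl.csv", 25.5, 329.63, 16.0) for a high E at ~16 lbs
```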
I'm trying to reorganise everything, which helps me understand it. I just discovered the 'hidden wires' feature, which is great: once I've understood what a certain block does, or have finished one of my own, I can get the wires out of the way and carry on undistracted. It's a bit risky to hide so many wires, but it makes it so much easier not to get completely lost :-)
By the way, the 'fanned fret' term is trademarked; some guy tried to patent it in the 80s, which is a bit silly since it has been done for centuries. There is a level of sophistication above this as well: check out http://www.truetemperament.com/, which really is something else. It is astounding how superior the tuning is on those wiggly frets; the problem is that they are rather awkward for string bending, and also you can't easily recrown or level the frets once they are worn. …
e matching with a dedicated component which creates combinations of items. You can find the [Cross Reference] component in the Sets.List panel.
When Grasshopper iterates over lists of items, it will match the first item in list A with the first item in list B. Then the second item in list A with the second item in list B and so on and so forth. Sometimes however you want all items in list A to combine with all items in list B, the [Cross Reference] component allows you to do this.
Here we have two input lists {A,B,C} and {X,Y,Z}. Normally Grasshopper would iterate over these lists and only consider the combinations {A,X}, {B,Y} and {C,Z}. There are however six more combinations that are not typically considered, to wit: {A,Y}, {A,Z}, {B,X}, {B,Z}, {C,X} and {C,Y}. As you can see the output of the [Cross Reference] component is such that all nine permutations are indeed present.
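Outside Grasshopper, the same two behaviours are easy to mimic; a minimal Python comparison (an analogy, not the component's actual implementation):

```python
from itertools import product

A = ["A", "B", "C"]
B = ["X", "Y", "Z"]

# Default item-by-item matching: first with first, second with second...
print(list(zip(A, B)))      # [('A', 'X'), ('B', 'Y'), ('C', 'Z')]

# Holistic cross referencing: every item of A against every item of B.
print(list(product(A, B)))  # all nine combinations
```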
We can denote the behaviour of data cross referencing using a table. The rows represent the first list of items, the columns the second. If we create all possible permutations, the table will have a dot in every single cell, as every cell represents a unique combination of two source list indices:
Sometimes however you don't want all possible permutations. Sometimes you wish to exclude certain areas because they would result in meaningless or invalid computations. A common exclusion principle is to ignore all cells that are on the diagonal of the table. The image above shows a 'holistic' matching, whereas the 'diagonal' option (available from the [Cross Reference] component menu) has gaps for {0,0}, {1,1}, {2,2} and {3,3}:
If we apply this to our {A,B,C}, {X,Y,Z} example, we should expect to not see the combinations for {A,X}, {B,Y} and {C,Z}:
The rule that is applied to 'diagonal' matching is: "Skip all permutations where all items have the same list index". 'Coincident' matching is the same as 'diagonal' matching in the case of two input lists, which is why I won't show a separate example of it here, but the rule is subtly different: "Skip all permutations where any two items have the same list index".
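The difference only shows up with three or more lists; a quick sketch of both rules (plain Python, illustrative lists):

```python
A, B, C = ["A", "B", "C"], ["X", "Y", "Z"], ["P", "Q", "R"]

# Diagonal: skip only the cells where *all* indices coincide.
diagonal = [(a, b, c)
            for i, a in enumerate(A) for j, b in enumerate(B) for k, c in enumerate(C)
            if not (i == j == k)]

# Coincident: skip cells where *any two* indices coincide.
coincident = [(a, b, c)
              for i, a in enumerate(A) for j, b in enumerate(B) for k, c in enumerate(C)
              if i != j and j != k and i != k]

print(len(diagonal), len(coincident))  # 24 vs 6
```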
The four remaining matching algorithms are all variations on the same theme. 'Lower triangle' matching applies the rule: "Skip all permutations where the index of an item is less than the index of the item in the next list", resulting in an empty triangle but with items on the diagonal.
'Lower triangle (strict)' matching goes one step further and also eliminates the items on the diagonal:
'Upper Triangle' and 'Upper Triangle (strict)' are mirror images of the previous two algorithms, resulting in empty triangles on the other side of the diagonal line:
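In the same index notation, the four triangle variants are simple comparisons on the two indices (again a sketch of the rules, not the component internals):

```python
A, B = ["A", "B", "C"], ["X", "Y", "Z"]

# Lower triangle: keep pairs where i >= j (diagonal included).
lower        = [(a, b) for i, a in enumerate(A) for j, b in enumerate(B) if i >= j]
# Lower triangle (strict): the diagonal is removed too.
lower_strict = [(a, b) for i, a in enumerate(A) for j, b in enumerate(B) if i > j]
# Upper triangle variants mirror these across the diagonal.
upper        = [(a, b) for i, a in enumerate(A) for j, b in enumerate(B) if i <= j]
upper_strict = [(a, b) for i, a in enumerate(A) for j, b in enumerate(B) if i < j]
```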
…
lts.
In the visualization, points is an interesting option; it's a matter of aesthetics, I guess, and I go with surfaces :) You can also try selecting Filters -> Slice (you can also find it in the icons above the pipeline viewer); in the Slice options below the pipeline, press Z Normal and set the Z coordinate to some height relevant to the buildings (e.g. 1.75 m, a typical human scale). That will show you the flow around the buildings at that height. Experiment with selecting other normals and values, and keep playing with the filters; there are some cool things in there. You can also check out the mailing list and the extensive ParaView documentation.
Concerning the errors, I apologise; I have only just downloaded your case.
It appears that the decomposeParDict is not included in the system folder. I am not sure whether this is because BF has not yet gone through the whole workflow, or an omission on our side. Please feel free to add it on GitHub; I will also note it down and pass it to Mostapha to check. In the meantime, please find attached a VERY detailed decomposeParDict file. I took the liberty of setting it to 4 processors (the numberOfSubdomains value) and also selected (that is, uncommented) the scotch decomposition method. It's the easiest method to use, since it is automatic and doesn't require any further input on how the domain is decomposed in the x, y and z directions (which would require you to change values in the attached file).
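For reference, the two entries that matter boil down to something like this (a minimal sketch, not the attached file):

```
numberOfSubdomains  4;       // split the mesh across 4 processor cores
method              scotch;  // automatic; no x/y/z split inputs needed
```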
Now, the different folders created are simply snapshots of the current solution at specific timesteps. To control how often the solver saves, change the writeInterval value in the controlDict file. You can also change almost all of these values on the fly, while OF is running.
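The relevant controlDict entries look something like this (illustrative values):

```
writeControl    timeStep;   // write a snapshot every N steps...
writeInterval   100;        // ...here N = 100
```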
Finally, concerning the other paraFoam errors: it seems that paraFoam is somehow reading the initial condition names instead of the actual results from the solution files, and it doesn't like that.
Does this happen only when you open the case (i.e. at time 0), or does it also happen when you move to another timestep?
Also, are you using paraFoam, ParaView, or the paraFoam -builtin method?
The extension of the paraFoam file seems to be .foam, which means you are probably using the built-in viewer. That might be the issue, but I'm not sure.
Can you try running ParaView, navigating to your case folder and opening the .foam file, to see if there is still an error?
Also, if it isn't too much trouble, can you zip one of the time folders and attach it here? I'd like to take a look at what's inside and check it against what the error report says.
Once again thanks for testing!
Kind regards,
Theodore.…
used, with 180 being for the northern hemisphere and 0 for the southern hemisphere. For the optimal tilt, to my knowledge, most tools simply correct the location's latitude through a single formula.
The TOF component is more sophisticated. It essentially replicates Solmetric's Annual Insolation Lookup tool. What it does is create a grid of points, where each point represents the calculated annual insolation on the surface (PV module, SWH collector, facade, any kind of surface) for a single tilt and azimuth angle. Each point is then elevated according to its annual insolation value, and a mesh is created from that grid of points. The highest portion of the mesh represents the optimal tilt and azimuth angles. So the higher your "precision_" input is, the more points the mesh will have, and thus the more precise the final optimal tilt and azimuth will be.
For the diffuse component of the annual incident solar radiation at each point, the modified Perez 1990 model is used; the direct component comes from the classical cosine law, and the ground-reflected component from Liu and Jordan (1963). So the TOF component calculates the optimal tilt and azimuth based on annual incident solar radiation, not AC energy.…
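The search itself is just a brute-force sweep over a tilt/azimuth grid; a schematic Python sketch, assuming a hypothetical annual_insolation(tilt, azimuth) callable standing in for the radiation models above:

```python
def optimal_tilt_azimuth(annual_insolation, precision=10):
    """Brute-force sweep over a tilt/azimuth grid; returns the best cell.

    annual_insolation(tilt_deg, azimuth_deg) stands in for the real models
    (Perez 1990 diffuse, cosine-law direct, Liu & Jordan ground-reflected).
    """
    best = None
    for i in range(precision + 1):
        tilt = 90.0 * i / precision              # 0 = horizontal, 90 = vertical
        for j in range(precision + 1):
            azimuth = 360.0 * j / precision      # 180 = due south
            value = annual_insolation(tilt, azimuth)
            if best is None or value > best[0]:
                best = (value, tilt, azimuth)
    return best

# Toy check with a fake field peaking at tilt 35, azimuth 180:
print(optimal_tilt_azimuth(lambda t, a: -((t - 35) ** 2 + (a - 180) ** 2), 36))
```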
up structural systems in the parametric environment of Grasshopper. Participants will be guided from the basics of analysing and interpreting structural models, through optimisation processes, to integrating Karamba3d into C# scripts.
This workshop is aimed at beginner to intermediate users of Karamba; however, advanced users are also encouraged to apply. It is open to both professional and academic users.
Course Fee:
Professional EUR 750 (+VAT)
Educational EUR 375 (+VAT)
Course Outline
Introduction & Presentation of project examples
Optimization of cross sections of line-based and surface-based elements
Geometric Optimization
Topological Optimization
Structural Performance Informed Form Finding
Understanding analysis algorithms embedded in Karamba and visualising results
Complex Workflow processes in Rhino3d, Grasshopper3d and Karamba3d
Places are limited to a maximum of 10 participants, with a limited number of educational places. A minimum of 4 places is required for the workshop to take place.
The workshop will be cancelled should this quota not be filled by May 31st.
The workshop will be taught in English. Basic Rhino and Grasshopper knowledge is recommended. No knowledge of Karamba is needed.
Participants should bring their own laptops with Rhino5 or Rhino6 and Grasshopper3d installed. A 90-day trial version of Rhino can be downloaded from Rhino3d.
Karamba ½ year licenses for non-commercial use will be provided to all participants.
…
up structural systems in the parametric environment of Grasshopper. Participants will be guided from the basics of analysing and interpreting structural models, through optimisation processes, to integrating Karamba3d into C# scripts.
This workshop is aimed at beginner to intermediate users of Karamba; however, advanced users are also encouraged to apply. It is open to both professional and academic users.
Course Fee:
Professional EUR 750 (+VAT)
Student EUR 375 (+VAT)
Course Outline
Introduction & Presentation of project examples
Optimization of cross sections of line-based and surface-based elements
Geometric Optimization
Topological Optimization
Structural Performance Informed Form Finding
Understanding analysis algorithms embedded in Karamba and visualising results
Complex Workflow processes in Rhino3d, Grasshopper3d and Karamba3d
Places are limited to a maximum of 10 participants, with a limited number of educational places. A minimum of 4 places is required for the workshop to take place.
The workshop will be cancelled should this quota not be filled by October 15th.
The workshop will be taught in English. Basic Rhino and Grasshopper knowledge is recommended. No knowledge of Karamba is needed.
Participants should bring their own laptops with Rhino5 or Rhino6 and Grasshopper3d installed. A 90-day trial version of Rhino can be downloaded from Rhino3d.
Karamba ½ year licenses for non-commercial use will be provided to all participants.
…
ive 'correct' normal.
Using non-normalized cross products effectively weights face normals by area; it is fast and simple, so we made that one the default.
In some cases normalizing the cross products improves the result, but not always.
Another option is to weight by angles, though this is computationally slightly more expensive, so it might not be ideal for real-time updates on large meshes.
As an example, here is a mesh with a 90° corner, and uneven meshing on the 2 sides.
The arrows show:
0- Area weighted (non-normalized cross products)
1- Angle weighted
2- Normalized cross-products
Here the angle-weighted normal is the one at 45°, which is intuitively the 'best' one in this case.
These three seem to be the most commonly used, but there are many other possible definitions of normals, such as inverse-area weighted, mean curvature, etc.
I think what would really be best is to put a few of these into Plankton and include an optional argument in GetNormal for selecting the one you need for a particular application.
Pull requests welcome if you feel inspired to add this!
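For concreteness, here is a rough Python/numpy sketch of the three weightings over a vertex's ordered one-ring (not Plankton's actual API; assumes a closed interior fan):

```python
import numpy as np

def vertex_normal(vertex, ring, mode="area"):
    """Vertex normal from an ordered one-ring of neighbours.

    mode: "area"  - sum raw cross products (area weighted, the fast default)
          "unit"  - normalize each cross product before summing
          "angle" - weight each unit face normal by its corner angle at the vertex
    """
    v = np.asarray(vertex, dtype=float)
    total = np.zeros(3)
    for a, b in zip(ring, ring[1:] + ring[:1]):
        e1 = np.asarray(a, dtype=float) - v
        e2 = np.asarray(b, dtype=float) - v
        c = np.cross(e1, e2)  # direction = face normal, length = 2 * face area
        if mode == "unit":
            c = c / np.linalg.norm(c)
        elif mode == "angle":
            cos = np.dot(e1, e2) / (np.linalg.norm(e1) * np.linalg.norm(e2))
            c = np.arccos(np.clip(cos, -1.0, 1.0)) * c / np.linalg.norm(c)
        total += c
    return total / np.linalg.norm(total)
```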
http://meshlabstuff.blogspot.co.uk/2009/04/on-computation-of-vertex-normals.html
http://steve.hollasch.net/cgindex/geometry/surfnorm.html…
us allows Grasshopper authors to stream geometry to the web in real time. It works like a chatroom for parametric geometry, and allows for on-the-fly 3D model mashups in the web browser. Multiple [Grasshopper] authors can stream geometry into a shared 3D environment on the web – a Platypus Session – and multiple viewers can join that session on 3dplatyp.us to interact with the 3D model. Platypus can be used to present parametric 3D models to a remote audience, to quickly collaborate with other Grasshopper users, or both!
You can download the Grasshopper plugin at food4rhino, and visit 3dplatyp.us to view your geometry on the web. This first round of alpha testing will run for two weeks, until April 24, 2014, after which the Grasshopper components will no longer solve.
We are very interested in hearing feedback from the community while the project is still in the prototyping stages of development. Please use the comments on this discussion to ask questions, suggest ideas, report bugs, etc. We are planning on rolling out another public alpha release or two this Spring, depending on how this first one goes, in advance of our Technology Symposium and Hackathon in New York.
Check out our getting started video below, and enjoy!
…
e rod with circular section (no goals allow for controlling torsion, as far as I know). The rods are set up with two options: a straight rest position or the (initial) bent one. The calibration integrated with the model is more about giving a relative scale between the forces than about accurately simulating them (at the moment). Anyway, I am trying to do this on a macro scale rather than a micro one, with elements that are rather thin.
The system is not stable at the moment. In fact, besides the rods' characteristics, it is quite fundamental to keep them planar where they intersect. I am lacking something, and probably also missing some parameters. In the script there are two goals to define this: imposing 90° between the vertical and horizontal elements, as well as between these and a normal to their intersection. To my understanding, the angle goal works three-dimensionally without a preferred plane, and this (hopefully) should address it.
I am just wondering if anyone can give me a hint on this. After this step, it would be great to understand whether the system can get out of its plane (through a pull force out of its plane, simulated in the script through point loads at the joints). I am still not entirely sure this is possible, but looking at how other auxetic patterns have been used to generate freeform surfaces, I am giving it a try.
Thank you
Claudio
PS: I also noticed this post and this one; really interesting. I see the problem with stability and the need to separate the states with an energy hill in the first, as well as some potential in using auxetics in the latter.…