ther math and logic. i can usually conceptualise what i want to do and cobble together some semi-working thing, but i don't know which components to use or how to patch them together. so i'm super happy that someone who knows what he's doing finds this interesting.
and i'm glad you mention the fanned frets again. there is one input parameter that's still missing for the multiscale frets to be fully parametric: the angle of the nut, or rather which fret should be straight. which setting is more comfortable depends a bit on personal preference and playing posture, so being able to adjust this easily would be cool. again i have no idea how the maths for that works, or whether you can just rotate each fret by the same amount around its middle point. either a fret number (for the straight fret) or a simple slider from bridge to nut should do as the input setting.
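For what it's worth, I don't think rotation is needed at all: each fret's position can be computed independently on the bass and treble strings with the standard 12th-root-of-2 rule, and then both strings are shifted lengthwise so the chosen fret lines up across the neck. A minimal Python sketch of that idea (all names and the example numbers are mine):

    def fret_x(scale, n):
        # standard 12-TET rule: distance from the nut to fret n
        return scale * (1.0 - 2.0 ** (-n / 12.0))

    def fan_layout(scale_bass, scale_treble, fret_count, straight_fret):
        # x-coordinates of nut (fret 0) .. last fret on the two outer strings,
        # shifted so `straight_fret` sits at x = 0 on both, i.e. perpendicular
        out = []
        for scale in (scale_bass, scale_treble):
            shift = fret_x(scale, straight_fret)
            out.append([fret_x(scale, n) - shift for n in range(fret_count + 1)])
        return out  # fret n's line joins out[0][n] (bass side) to out[1][n] (treble side)

    # example: roughly a 27"/25.5" fan in mm with the 7th fret straight
    bass, treble = fan_layout(686.0, 648.0, 24, 7)

Because the formula is continuous, straight_fret doesn't have to be an integer, so the 'simple slider from bridge to nut' input works just as well as a fret number.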
Here are the two extremes and the middle ground:
i've been thinking today, while analysing your patches and cleaning up my mess, about what exactly the monster should do.
Here are the input parameters needed; i think it's the complete list (a rough sketch of how they might be grouped in code follows below the list):
scale length low E string
scale length high e string
fret angle/straight fret
string width at nut
string width at bridge
number of frets
fretboard overhang at nut (distance from string to fretboard bounds)
fretboard overhang at last fret
string gauges
string tensions
fretboard radius at nut (for a compound radius, the radius at the bridge is calculated with the StewMac formula)
fretwire crown width
fretwire crown height
action height at nut (distance between bottom of string and fretwire crown top)
action height at last fret
pickup 1 neck position
pickup 2 middle position
pickup 3 bridge position
nut width
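To make the list concrete, here is one hypothetical way the inputs could be collected in a Python/GhPython sketch; every name, value and unit (mm assumed) below is made up:

    # hypothetical input set; all names, values and units are assumptions
    params = {
        "scale_length_bass": 686.0,      # low E string
        "scale_length_treble": 648.0,    # high e string
        "straight_fret": 7,              # fret angle / which fret is straight
        "string_width_nut": 35.0,
        "string_width_bridge": 52.0,
        "fret_count": 24,
        "overhang_nut": 3.0,
        "overhang_last_fret": 3.0,
        "radius_nut": 241.3,             # bridge radius via the StewMac formula
        "crown_width": 2.3,
        "crown_height": 1.2,
        "action_nut": 0.4,
        "action_last_fret": 2.0,
        "nut_width": 43.0,
        "pickup_positions": [0.79, 0.87, 0.96],  # neck/middle/bridge, fraction of scale
        # string gauges and tensions would come from the string spreadsheet
    }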
the pickup positions should be used to draw circles for the magnet poles on each string, so they are perfectly aligned and can be used for the pickup flatwork construction. ideally they would also need a rotation control to align the centre line of the pickup somewhere between the last fret angle and the bridge angle. personally i do this visually depending on the design i'm looking for; some people have huge theories on pickup positioning, but personally i don't believe in them.
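The pole layout itself is just a 2D line intersection in plan view: rotate the pickup's centre line by the chosen angle and intersect it with each string line. A rough sketch, all names mine:

    import math

    def line_intersect(p1, p2, p3, p4):
        # 2D intersection of the infinite lines p1-p2 and p3-p4
        x1, y1 = p1; x2, y2 = p2; x3, y3 = p3; x4, y4 = p4
        den = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
        t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / den
        return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))

    def pole_centres(strings, pickup_pos, angle_deg):
        # strings: list of (nut_pt, bridge_pt) per string, in plan view
        # pickup_pos: point on the centre line; angle_deg: pickup rotation
        a = math.radians(angle_deg)
        p2 = (pickup_pos[0] + math.cos(a), pickup_pos[1] + math.sin(a))
        return [line_intersect(n, b, pickup_pos, p2) for n, b in strings]

The angle input maps directly onto the "somewhere between the last fret angle and bridge angle" idea: sweep it between those two values and eyeball the result.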
that should result in everything needed to quickly generate all the necessary construction curves or geometry for nut/fingerboard/frets/pickups. this is the core of what makes a guitar work: the more precise this dynamic system is, the better the guitar plays and sounds.
i posted another thread trying to understand how i could use datasets from spreadsheets, databases or CSV files to organize the input parameters. what would make sense for the strings, for example, is to hook into a spreadsheet with the different string sets; i attached one for the D'Addario NYXL string line, which basically covers all combos that make sense.
The string tension is an interesting one, and implementing it would surely be overkill, albeit super interesting to try. it should be possible to extrapolate, from the scale length of each string, what the tension for a given string gauge would be, so that you could say 'i want a fully balanced set' or 'heavy top, light bottom' and it would calculate which SKU from D'Addario best matches the required tension. all the strings listed in the spreadsheet are available to buy as single strings.
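The maths for that part is actually well documented: D'Addario publishes the formula behind its tension charts, T = UW x (2LF)^2 / 386.4, with UW the string's unit weight in lb/inch, L the scale length in inches and F the pitch in Hz. A sketch of using it to pick the closest single string (the catalog layout is my assumption; the spreadsheet would supply the real data):

    def string_tension(unit_weight, scale_length, freq):
        # D'Addario tension formula: T = UW * (2*L*F)^2 / 386.4
        # unit_weight in lb/inch, scale_length in inches, freq in Hz, result in lbs
        return unit_weight * (2.0 * scale_length * freq) ** 2 / 386.4

    def closest_sku(target, scale_length, freq, catalog):
        # catalog: list of (sku, unit_weight) tuples from the NYXL spreadsheet
        return min(catalog,
                   key=lambda s: abs(string_tension(s[1], scale_length, freq) - target))

A 'fully balanced set' then just passes the same target tension for every string, each with its own (fanned) scale length and pitch.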
i'm trying to reorganize everything, which helps me understand it. i just discovered the 'hidden wires' feature, which is great: once i've understood what a certain block does, or have finished one of my own, i can get the wires out of the way and carry on undistracted. it's a bit risky to hide so many wires, but it makes it so much easier not to get completely lost :-)
btw, the 'fanned fret' term is trademarked; some guy tried to patent it in the 80's, which is a bit silly since it has been done for centuries. there is a level of sophistication above this as well: check out http://www.truetemperament.com/ and that really is something else. it really is astounding how superior the tuning is on those wigglefrets; the problem is that it's rather awkward for string bending, and you also can't easily recrown or level the frets when they are used. …
e matching with a dedicated component which creates combinations of items. You can find the [Cross Reference] component in the Sets.List panel.
When Grasshopper iterates over lists of items, it will match the first item in list A with the first item in list B. Then the second item in list A with the second item in list B and so on and so forth. Sometimes however you want all items in list A to combine with all items in list B, the [Cross Reference] component allows you to do this.
Here we have two input lists {A,B,C} and {X,Y,Z}. Normally Grasshopper would iterate over these lists and only consider the combinations {A,X}, {B,Y} and {C,Z}. There are however six more combinations that are not typically considered, to wit: {A,Y}, {A,Z}, {B,X}, {B,Z}, {C,X} and {C,Y}. As you can see the output of the [Cross Reference] component is such that all nine permutations are indeed present.
We can denote the behaviour of data cross referencing using a table. The rows represent the first list of items, the columns the second. If we create all possible permutations, the table will have a dot in every single cell, as every cell represents a unique combination of two source list indices:
Sometimes however you don't want all possible permutations. Sometimes you wish to exclude certain areas because they would result in meaningless or invalid computations. A common exclusion principle is to ignore all cells that are on the diagonal of the table. The image above shows a 'holistic' matching, whereas the 'diagonal' option (available from the [Cross Reference] component menu) has gaps for {0,0}, {1,1}, {2,2} and {3,3}:
If we apply this to our {A,B,C}, {X,Y,Z} example, we should expect to not see the combinations for {A,X}, {B,Y} and {C,Z}:
The rule that is applied to 'diagonal' matching is: "Skip all permutations where all items have the same list index". 'Coincident' matching is the same as 'diagonal' matching in the case of two input lists which is why I won't show an example of it here (since we are only dealing with 2-list examples), but the rule is subtly different: "Skip all permutations where any two items have the same list index".
The four remaining matching algorithms are all variations on the same theme. 'Lower triangle' matching applies the rule: "Skip all permutations where the index of an item is less than the index of the item in the next list", resulting in an empty triangle but with items on the diagonal.
'Lower triangle (strict)' matching goes one step further and also eliminates the items on the diagonal:
'Upper Triangle' and 'Upper Triangle (strict)' are mirror images of the previous two algorithms, resulting in empty triangles on the other side of the diagonal line:
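For anyone who wants to sanity-check these rules outside Grasshopper, they are easy to reproduce with index pairs in Python (this mirrors the rules as stated above, not Grasshopper's internal code):

    from itertools import product

    A, B = ['A', 'B', 'C'], ['X', 'Y', 'Z']
    pairs = list(product(range(len(A)), range(len(B))))   # holistic: every cell

    diagonal = [(i, j) for i, j in pairs if i != j]       # skip {0,0}, {1,1}, {2,2}
    lower    = [(i, j) for i, j in pairs if i >= j]       # lower triangle, diagonal kept
    lower_s  = [(i, j) for i, j in pairs if i >  j]       # lower triangle (strict)
    upper    = [(i, j) for i, j in pairs if i <= j]       # upper triangle
    upper_s  = [(i, j) for i, j in pairs if i <  j]       # upper triangle (strict)

    print([(A[i], B[j]) for i, j in diagonal])            # {A,Y}, {A,Z}, {B,X}, ...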
…
ake a modest notice about the two new Ladybug components, one of which creates a 3d terrain shading mask and another which visualizes and exports horizon angles. A terrain shading mask is essentially a diagram which maps the silhouette of the surrounding terrain (hills, valleys, mountains, tree tops...) around the chosen location and accounts for the shading losses from the terrain. It can be used as a context_ input in mountainous or higher-latitude regions for any kind of sun-related analysis: sunlight hours analysis, solar radiation analysis, view analysis, photovoltaics/solar water heating sunpath shading...
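The underlying idea of a horizon angle is simple enough to sketch: for a given azimuth, it is the steepest apparent elevation of the terrain along that direction. A toy version (not the component's actual code; the sample format is my assumption):

    import math

    def horizon_angle(observer_elev, samples):
        # samples: (horizontal distance, terrain elevation) pairs marching
        # outward from the observer along one azimuth
        best = 0.0
        for dist, elev in samples:
            ang = math.degrees(math.atan2(elev - observer_elev, dist))
            best = max(best, ang)
        return best

    # repeating this for e.g. 360 azimuths gives the silhouette that the
    # terrain shading mask wraps around the sun path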
My home town is an example of the shading caused by the terrain. Here is how it looks from the tallest building in the town:
And the created terrain shading mask:
A mask for any land location up to 60 degrees North can be created:
There will also be support for a few major cities above this limit.
Both the Terrain shading mask and Horizon angles components can be downloaded from here. An example .gh file can be found here.
The component will prompt the user to download and copy certain files in order to be able to run.
It was created with assistance from Dr. Bojan Savric. Support on various issues was further given by: Dr. Graham Dawson, Dr. Alec Bennett, Dr. Ulrich Deuschle, Andrew T. Young, LiMinlu, Jonathan de Ferranti, Michal Migurski, Christopher Crosby, Even Rouault, Tamas Szekeres, Izabela Spasic, Mostapha Sadeghipour Roudsari, Dragan Milenkovic, Chen Weiqing, Menno Deij-van Rijswijk and gis.stackexchange.com community.
I hope somebody might find the components useful.…
st between those two applications. But as soon as every frame is recalculated, I noticed that the intersection function is very slow. It is actually so slow that the maximum number of polygons to play with is only 10 or less.
Could you help me to find a faster solution for my script?
calculation of intersection lines:
//////////////////////////////////////////////////////////////////////////////////////////
import ghpythonlib.components as ghcomp
import rhinoscriptsyntax as rs

def ctr(crv):
    # approximate centre of a polygon: average of its unique vertices
    pts = ghcomp.Explode(crv)[1]
    pts = ghcomp.CullDuplicates(pts, 0.001)[0]
    return ghcomp.Average(pts)

pts = []
lines = []
ctr_c1 = ctr(C1)  # C1, C2 are the GhPython component inputs
for crv in C2:
    if ctr(crv) != ctr_c1:  # skip the polygon that coincides with C1 itself
        ix = ghcomp.CurveXCurve(C1, crv)[0]  # intersection points
        if ix:
            pts.extend(ix)
            lines.append(rs.AddLine(ix[0], ix[1]))
/////////////////////////////////////////////////////////////////////////////////////////////
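For comparison, one common speed-up is to skip ghpythonlib entirely, since every ghcomp call instantiates a full Grasshopper component per invocation. A hedged sketch of the same loop calling RhinoCommon directly (the tolerance value is my assumption, and the centre check is left out for brevity):

    import Rhino.Geometry as rg

    tol = 0.001  # assumed intersection tolerance

    pts = []
    lines = []
    for crv in C2:
        events = rg.Intersect.Intersection.CurveCurve(C1, crv, tol, tol)
        if events and events.Count >= 2:
            a, b = events[0].PointA, events[1].PointA
            pts.extend([a, b])
            lines.append(rg.Line(a, b))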
The overall description of the script:
a) Processing + gHowl is used for moving objects and physics;
b) a Python script (the slowest part) calculates the intersection lines;
c) the intersected parts of the polygons are rotated by 90 degrees.
I have attached the Grasshopper and Processing files (Processing is not necessary to test the script).
Thank you in advance,
Pereas.
…
simple, there are many symmetries in the 3 main planes. So I used arcs rotated 45° from the main planes and generated a pentagon, which was mirrored and rotated many times.
At the end there are 24 pentagons and 8 hexagons, so 32 faces, 54 vertices and 84 edges (each edge is shared by two faces, so (24×5 + 8×6)/2 = 84, and Euler's formula checks out: 54 − 84 + 32 = 2).
It could generate some other tessellation styles
…
see in my bottom post image there is only one isocurve showing in U and V.
In Grasshopper there's no surface rebuild? Well, the same old Grasshopper Patch command will let you specify spans I guess, to make a surface from a planar curve, but it won't work for things with holes since they will just fill in!
You can recreate a surface painfully by untrimming, adding many UV points, rebuilding from those points, then retrimming with the original surface info, but the retrimming simply fails.
If you make a planar surface from a curve in Rhino, you end up with utterly no point editability:
No wonder my CreatePatch tests were a failure. The starting surface could not be distorted except in the extreme case of moving four corner points!
I have no idea how to successfully rebuild a surface akin to the Rhino Rebuild command. It's great to be able to prototype in Grasshopper, but with Python I can rebuild easily (http://4.rhino3d.com/5/rhinocommon/?topic=html/M_Rhino_Geometry_Surface_Rebuild.htm), so I guess I should start a collection, like Peter, of little script components for prototyping with.…
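For reference, a minimal GhPython sketch of that RhinoCommon call (the input names are mine; srf would be the component's surface input):

    import Rhino.Geometry as rg  # srf comes in as a Rhino.Geometry.Surface

    def rebuild(srf, u_count, v_count, u_degree=3, v_degree=3):
        # Surface.Rebuild returns a brand new NurbsSurface with the requested
        # degrees and control point counts; the input surface is left untouched
        return srf.Rebuild(u_degree, v_degree, u_count, v_count)

    a = rebuild(srf, 10, 10)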
nowledge, tools, materials and machines. The Clusters provide a focus for workshop participants working together within a common framework.
Clusters provide a forum for the exchange of ideas, processes and techniques and act as a catalyst for design resolution. The Workshop is made up of ten Clusters that respond in diverse ways to the sg2012 Challenge Material Intensities. The Call for Clusters is now open to proposals which respond in innovative ways to this year's challenge.
Deadline: September 19 2011
More information can be found here:
http://smartgeometry.org/index.php?option=com_content&view=article&id=129&Itemid=146
sg2012 takes place from 19-24 March 2012 at EMPAC (http://empac.rpi.edu/) and is hosted by Rensselaer Polytechnic Institute in Troy, upstate New York USA. The Workshop and Conference will be a gathering of the global community of innovators and pioneers in the fields of architecture, design and engineering.
The event will be in two parts: a four day Workshop 19-22 March, and a public conference beginning with Talkshop 23 March, followed by a Symposium 24 March. The event follows the format of the highly successful preceding events sg2010 Barcelona and sg2011 Copenhagen.
sg2012 Challenge Material Intensities
Simulation, Energy, Environment
Imagine the design space of architecture was no longer at the scale of rooms, walls and atria, but that of cells, grains and vapour droplets. Rather than the flow of people, services, or construction schedules, the focus becomes the flow of light, vapour, molecular vibrations and growth schedules: design from the inside out.
The sg2012 challenge, Material Intensities, is intended to dissolve our notion of the built environment as inert constructions enclosing physically sealed spaces. Spaces and boundaries are abundant with vibration, fluctuating intensities, shifting gradients and flows. The materials that define them are in a continual state of becoming: a dance of energy and information.

Material potential is defined by multiple properties: acoustical, chemical, electrical, environmental, magnetic, manufacturing, mechanical, optical, radiological, sensorial, and thermal. The challenge for sg2012 Material Intensities is to consider material economy when creating environments, micro-climates and contexts congenial for social interaction, activities and organisation. This challenge calls for design innovation and dialogue between disciplines and responsibilities.

sg2010 Working Prototypes strove to emancipate digital design from the hard drive by moving from the virtual to the actual in wrestling with the tangible world of physical fabrication. sg2011 Building the Invisible focused on informing digital design with real world data. sg2012 Material Intensities strives to energise our digital prototypes and infuse them with material behaviour. They have the potential to become rich simulations informed by the material dynamics, chemical composition, energy flows, force fields and environmental conditions that feed back into the design process.
More information can be found at http://www.smartgeometry.org…
umbrella of Urban Heat Island (UHI) and I am going to try to separate them out in order to give you a sense of the current capabilities in LB+HB.
1) UHI as defined as a recorded elevated air temperature in an urban area:
If you have access to epw files for both an urban area and a rural area, you can use Ladybug to visualize and deeply explore the differences between the two weather files. Ladybug is primarily a tool for weather file visualization and analysis and it can be very helpful for understanding the consequences of UHI on strategies for buildings or on comfort. This said, if you do not have both rural and urban recorded weather data or you want to generate your own weather files based on criteria about urban areas (as it sounds like you want to do), this definition might not be so helpful.
2) UHI defined as an elevated air temperature but viewed as a computer-modelable phenomenon resulting primarily from urban canyon geometry, building materials, and (to a lesser degree) anthropogenic heat:
This definition seems to fit more with the type of thing that you are looking for, but it is unfortunately very difficult and computationally intensive, such that we do not currently have anything within Ladybug to do this right now. I can say that the state of the art for this type of modeling is an application called Town Energy Budget (TEB), and this is what all of the advanced UHI researchers that I know use (http://www.cnrm.meteo.fr/surfex/spip.php?article7). Unfortunately for those trying to use it in professional practice, it can take a while to get comfortable with it and it currently runs exclusively on Linux (this does mean that it is open source, though, and that you can really get deep into the assumptions of the model). A couple of years ago, a peer of mine translated almost all of TEB into the Matlab language, making it possible to run it on Windows if you have Matlab. He wrapped everything together into a tool called the Urban Weather Generator (UWG), which can take an epw file of a rural area and warp it to an urban area based on inputs that you give of building height, materials, vegetation, anthropogenic heat, etc. I would recommend looking into this for your project, although bear in mind that it is not open source like the original TEB tool and that you may need to get a (very expensive) copy of MATLAB (http://urbanmicroclimate.scripts.mit.edu/uwg.php).
3) UHI as defined by a thermal satellite image of an urban area depicting an elevated average radiant environment that reaches a maximum at the city center and changes by land use:
This is the definition of UHI that I am most familiar with and was the basis of much of my past research. I feel that it is also a definition of UHI that is a bit more in line with where a lot of contemporary UHI research is headed, which is away from the notion of UHI as a macro-scale meteorological phenomenon that is averaged as an air temperature over a huge area, towards one that accepts that different land uses have different microclimates and (importantly) different radiant environments. While the air temperature difference between urban and rural areas usually does not change more than 1-4 C, the radiant environment can be very different (on the order of 10-15 C differences). The best way to understand UHI in this context is with thermal satellite images, for which there is a huge database of publicly available data on NASA's glovis website (http://glovis.usgs.gov/) or their ECHO website (http://reverb.echo.nasa.gov/reverb/#utf8=%E2%9C%93&spatial_map=satellite&spatial_type=rectangle). I tend to use thermal data from the LANDSAT 5-8 and ASTER satellites in my research. Unfortunately, there is a lot of bad data with a lot of cloud cover mixed in with the really good stuff, and it can take some time to find good images. Also, there aren't too many programs that read the GeoTiff file format that you download the data as. I know that ArcGIS will read it, a program called ENVI will read it (and I think that the open source QGIS can also read it). I have plans to write a set of components to bring this type of data into Rhino and GH (I may get to it a few months down the line).
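In the meantime, the GeoTiff format itself is easy to script against: the open-source GDAL Python bindings will read the thermal bands directly. A sketch (not a Ladybug component; the file name is made up):

    from osgeo import gdal  # open-source GDAL Python bindings

    ds = gdal.Open('LC8_band10_thermal.tif')   # hypothetical LANDSAT 8 band 10 file
    band = ds.GetRasterBand(1)
    dn = band.ReadAsArray()                    # raw digital numbers as a 2D array;
                                               # radiance/temperature calibration still needed
    print(ds.RasterXSize, ds.RasterYSize, dn.mean())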
4) UHI as a computer model-able notion of "Urban Microclimate" with consideration of local differences and the local radiant environment:
This is where a lot of my research has led and, thankfully, it is an area that Honeybee can help you out a lot with. EnergyPlus simulations can output information on outside building surface temperatures, and these can be very helpful for getting a sense of the radiant environment around individual buildings. Right now, I am focusing just on using this data to fully model the indoor environments of buildings, as you see in this video:
https://www.youtube.com/watch?v=fNylb42FPIc&list=UUc6HWbF4UtdKdjbZ2tvwiCQ
I have plans to move this methodology to the outdoors once I complete this initial application to the indoors. For now, you can use the "Surface result reader" and the "color surfaces based on EP result" components to get a sense of variation in the outside temperature of your buildings.
I hope that this helped,
-Chris
…