be in your definition is impossible.
The two (for edge pieces) or three (for corner pieces) stickers of a piece can never be the same color (see image),
because they are part of the same physical piece (they always move together).
So in your definition it is as if you had removed the stickers from the cube and reattached them randomly, which generally results in an unsolvable cube...
In order to start with a properly scrambled cube, I believe you could start from a solved cube and apply a large number of random face turns to it (just as you would in real life).
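In case it helps, here is a minimal Python sketch of that idea (the function name `random_scramble` and the move notation are my own; the notation is standard Singmaster): every move in the sequence is a legal face turn, so applying it to a solved cube always yields a solvable scramble.

```python
import random

# The six faces of the cube; a move is a face turn, optionally inverted (') or doubled (2).
FACES = ["U", "D", "L", "R", "F", "B"]
SUFFIXES = ["", "'", "2"]

def random_scramble(length=25):
    """Build a random but legal scramble sequence, e.g. "R U' F2 ...".

    Because every move is a legal face turn applied to a solved cube,
    the resulting state is always solvable, unlike random stickering.
    Consecutive turns of the same face are skipped so moves cannot
    cancel or merge.
    """
    moves = []
    last_face = None
    while len(moves) < length:
        face = random.choice(FACES)
        if face == last_face:
            continue
        moves.append(face + random.choice(SUFFIXES))
        last_face = face
    return " ".join(moves)
```

In Grasshopper this could live in a GhPython component, with the scramble string driving however you model the cube state.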
On another subject:
"There are over 43 quintillion legal positions of the Rubik’s Cube.
It would take thirteen hundred million years to see every position if you were able to view one thousand per second.
If we stacked 43 quintillion pennies, the stack would be tall enough to reach the sun and return to the earth four thousand billion times."
source: http://b.chrishunt.co/how-many-positions-on-a-rubiks-cube
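The "thirteen hundred million years" figure checks out; a quick back-of-the-envelope check (using the exact count of legal positions):

```python
POSITIONS = 43_252_003_274_489_856_000   # legal Rubik's Cube positions (~43 quintillion)
RATE = 1000                              # positions viewed per second
SECONDS_PER_YEAR = 365.25 * 24 * 3600    # ~3.156e7 seconds

years = POSITIONS / RATE / SECONDS_PER_YEAR
print(f"{years:.2e}")                    # → 1.37e+09, i.e. about 1.4 billion years
```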
So, trying to brute-force the Rubik's cube is definitely not the way to go... :)
Of course there are a number of algorithms for solving the cube programmatically (examples), but I don't know how easy they would be to implement in GH....
Best of luck and please keep us posted!
Nikos
…
Added by nikos tzar at 10:42am on January 31, 2017
I am currently using a static class that, when the plugin initializes (i.e. when one of my components is placed on the canvas), registers two event handlers (only once):
public class Control
{
    private static volatile Control _instance = null;
    public bool registered;

    private Control()
    {
        registered = false;
    }

    public static Control getInstance(GH_Document document)
    {
        if (_instance == null)
        {
            _instance = new Control();
            if (!_instance.registered)
            {
                document.RaiseEvents = true;
                document.SolutionStart += _instance.document_SolutionStart;
                document.SolutionEnd += _instance.document_SolutionEnd;
                _instance.registered = true; // was never set in my first version, so the guard had no effect
            }
        }
        return _instance;
    }
}
Now the idea is that:
the SolutionStart handler clears all static variables
a new job file is read and the sliders are set accordingly
the components produce their solutions and record them in the static variables
the SolutionEnd handler writes all the data stored in the static variables to a result file and schedules a new solution, like:
void document_SolutionEnd(object sender, GH_SolutionEventArgs e) {
e.Document.ScheduleSolution(1);
}
Now the problem is that at each iteration not all sliders are adjusted (some stay constant over multiple iterations). I found that only the components downstream of the sliders that were changed write their results to the file, while the rest just write empty data (because I clear all the static variable data at SolutionStart).
So I was wondering if there is a way to schedule a solution that solves all objects on the canvas, regardless of whether they are downstream of a changed slider (like clicking "Recompute" on the canvas).
Note:
I guess an alternative could be a more intelligent 'clearing' of the static variables in step 1, so that, for instance, a static variable is cleared only if data was written to it by the components during that solution. However, since I am writing to the same static variable (which is a list) from multiple components, it becomes quite involved to track which components did and which didn't recompute during the solution.
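For what it's worth, the 'more intelligent clearing' can be made tractable by keying the shared store by component instead of appending to one flat list. A hypothetical plain-Python illustration (all names are mine; no Grasshopper API involved):

```python
# Shared store keyed by component id, instead of one flat list.
results = {}

def on_solution_start(changed_component_ids):
    """Clear only the slots of components that will recompute."""
    for cid in changed_component_ids:
        results.pop(cid, None)

def record(component_id, data):
    """Called by each component that actually recomputed."""
    results[component_id] = data

def on_solution_end():
    """Flatten the keyed store back into the list the file writer expects."""
    return [results[cid] for cid in sorted(results)]
```

Because the start handler only pops the keys of components that will recompute, components untouched by the slider changes keep their previous results.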
Cheers Dion
…
, please let me know.
Also, I'd like to preface this request with a great deal of gratitude and thanks for creating/working on this software. So - thank you!
1) Variable parameter insertion for clusters
Right now, if you have a cluster and want to add an input/output exposed to the parent document, you have to open the cluster, add an input/output, close the cluster, and hook the cluster back up. This is all well and good, but if the cluster takes a few seconds to execute, adding/removing cluster inputs/outputs becomes pretty cumbersome.
Instead, it would be great if the variable parameter capability were added to clusters (à la the C#/VB/Python components). I'm imagining the functionality being:
You zoom into the cluster, and click on the 'plus'/'minus' buttons, which adds/removes an input/output (a generic input param) within the cluster, in an arbitrary location. You can then hook parameters up, and then double-click to enter the cluster. After you find the input components, you can continue hooking them up to the rest of your cluster.
This would speed up the ability to work fluidly with clusters.
2) De-clustering functionality
As with coding, sometimes a cluster makes sense; sometimes it doesn't. Sometimes the wrong aspects of an operation have been clustered, and it makes sense to undo the whole thing. But when this happens, you have to copy-paste from within the cluster and manually hook all of the wires back up, which is a great annoyance.
It would help greatly to right-click on a cluster, select 'decluster', and have the cluster un-clustered, with all of the connections reconnected in their original position/location. By making the 'cluster' operation work both ways, I think this would also really enhance a cluster workflow.
This would also solve the problem of adding things into a cluster. Normally, you have to copy components, paste them into the cluster, and re-link them within, which is a simple operation, yet takes a minute or so. With this functionality, you'd be able to decluster, select the declustered components + new components, and cluster again -- quickly adding components/items to a cluster.
Thanks for hearing me out!
Best,
Dan…
Added by Dan Taeyoung at 3:41pm on January 10, 2014
es which you can see below in my mesh repair report I ran on the mesh after baking it.
This is a bad mesh.
Here is what is wrong with this mesh:
Mesh has 2 non manifold edges. <<------ because of the duplicate face
Mesh has 1 duplicate face.
Skipping face direction check because of positive non manifold edge count.
General information about this mesh:
Mesh does not have any degenerate faces.
Mesh does not have any extremely short edges.
Mesh does not have any naked edges.
Mesh does not have any self intersecting faces.
Mesh does not have any disjoint pieces.
Mesh does not have any unused vertices.
Continuing the repair process does get rid of the duplicate face. I also realized that Rhino 5 has a command called "ExtractDuplicateMeshFaces" which works quite nicely.
However, this "method" is not currently available in RhinoCommon. So I wonder who I can ask to add it? It seems it would make sense for it to be in RhinoCommon, considering we have methods available for each of the other tests run by mesh repair. The reason I am interested in this command is that it seems to work very fast.
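Until such a method lands in RhinoCommon, duplicate faces can be found by hand. A rough Python sketch of the idea (not RhinoCommon API; the function name is mine) that treats a face as a duplicate when it uses the same vertex set as an earlier face, regardless of winding:

```python
def find_duplicate_faces(faces):
    """Return indices of faces that reuse an earlier face's vertices.

    `faces` is a list of vertex-index tuples (triangles or quads).
    Two faces count as duplicates when they reference the same set
    of vertices, regardless of winding or starting vertex.
    """
    seen = {}
    duplicates = []
    for i, face in enumerate(faces):
        key = frozenset(face)
        if key in seen:
            duplicates.append(i)
        else:
            seen[key] = i
    return duplicates
```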
Thanks
…
Added by Michael Pryor at 12:53am on December 11, 2014
, one is office and the other one is hotel for example.
- a list will control the points' heights
- to each of these points I am attaching a plane
- this plane will intersect the skin of the tower
- from this intersection I am trying to extract the floor plate (as a Curve) and the area (as data)
I think I've got most of the script, but I have some errors that I can't figure out.
Please, if you have time, can you take a look at this code?
thank you very much
Private Sub RunScript(ByVal skin As Brep, ByVal f1 As Integer, ByVal DELTAh1 As Double, ByVal Lobby1h As Double, ByVal f2 As Integer, ByVal DELTAh2 As Double, ByVal Lobby2h As Double, ByRef A As Object, ByRef C As Object)
    'Lists for the floor-plate curves and their areas
    Dim floorPlate As New List(Of Curve)
    Dim Areas As New List(Of Double)

    'First sector (office). The two sectors need two separate loops;
    'nesting one loop inside the other was the main bug.
    For i As Integer = Lobby1h To f1 + Lobby1h
      Dim pl As New Plane(New Point3d(0, 0, i * DELTAh1), Vector3d.ZAxis)
      Dim iCurves As Curve() = Nothing
      Dim iPoints As Point3d() = Nothing 'needed only for the BrepPlane signature
      If Rhino.Geometry.Intersect.Intersection.BrepPlane(skin, pl, 0.1, iCurves, iPoints) AndAlso iCurves.Length > 0 Then
        floorPlate.Add(iCurves(0)) 'assuming one curve per intersection
      End If
    Next

    'Second sector (hotel), assuming its levels use DELTAh2
    For j As Integer = f1 + Lobby1h + Lobby2h To f2 + f1 + Lobby1h + Lobby2h
      Dim pl As New Plane(New Point3d(0, 0, j * DELTAh2), Vector3d.ZAxis)
      Dim iCurves As Curve() = Nothing
      Dim iPoints As Point3d() = Nothing
      If Rhino.Geometry.Intersect.Intersection.BrepPlane(skin, pl, 0.1, iCurves, iPoints) AndAlso iCurves.Length > 0 Then
        floorPlate.Add(iCurves(0))
      End If
    Next

    'AreaMassProperties.Compute takes one curve at a time and returns an
    'AreaMassProperties object, so collect each .Area into the list
    For k As Integer = 0 To floorPlate.Count - 1
      Dim amp As AreaMassProperties = AreaMassProperties.Compute(floorPlate(k))
      If amp IsNot Nothing Then Areas.Add(amp.Area)
    Next

    A = Areas
    C = floorPlate
End Sub
…
he picture (4).
Previously, I had a problem with generating intersections between the two directions of the beams, but a colleague helped me by extending the beams, so the intersection lines were no longer a problem. However, this solution generated a curl (5) at the highest vertex of the geometry, which I ignored, intending to repair it before printing; perhaps this means my beams are not spread properly. Only when there are 19 beams does the problem not appear, but I still cannot distribute them properly.
(1)
(2)
(3)
(4)
(5)
I tried to show it as simply as possible by removing or annotating parts of my code in the GHX file.
Thank you in advance for your help
…
.
I had updated the components a few weeks ago but then got too lazy/busy to properly document that anywhere. Some of the additional features are:
1. It is now possible to substitute an IES file with a text string. For example, one can paste the contents of an IES file into a text panel and connect that to the _iesFilePath input. Alternatively, you can read a text file using the native Grasshopper "Read File" component, then embed (and internalize) that information inside a "Text" component.
So, either of the two options below will (or at least should) qualify as an input for _iesFilePath:
This makes it possible to embed IES data inside a Grasshopper file, doing away with the need to connect to a file on a local drive.
2. I created a new component called Honeybee_IES Project which does the following:
1. It consolidates all the electric lighting RAD files for a simulation in one place. The single radFilePaths output from the component can be connected to the daylight simulation instead of connecting individual radFilePath outputs from every luminaire.
2. It creates a BOQ and luminaire schedule for all the luminaires used in the simulation. The schedule can either be viewed in a Grasshopper text panel or exported to Excel.
The values for LLF, Candela Multiplier and Lamp Depreciation factor are printed out for each luminaire.
The effect of the multipliers on power consumption can be seen in the BOQ in the Total Power column:
Adding lumens to the output will be a minor fix. I will update that within a few days.
I think the point-grid for the photometric and peak candela display are a great idea. I will add that functionality within a couple of weeks.
Are you implying the inclusion of Type B photometry by "support for all IES file types"? If so, that has been on my to-do list for a while. It might, however, be a while before I can get to it, as it would require writing a converter from Type B to Type C so that the photometry can be visualized as a photometric mesh inside the Rhino viewport. I think the hackish way to get Type B photometry to work in Honeybee is to first convert it to Type C using something like the Photometric Toolbox.
Finally, the electric lighting components were initially written as a hack and they are still pretty much a work in progress. I agree that calling the simulation a lighting simulation and adding separate inputs for electric lights would be a cleaner way of approaching these simulations. Mostapha and I weren't sure how much traction these new features might get. Based on the feedback received, we will be simplifying and enhancing these components and the workflow for electric lighting simulations.
(PS: Although I have heard a lot about Accelerad, due to the lack of compatible resources, I have never run a gpu-based simulation myself. I am not sure if Nathaniel requires additional flags or information to run Radiance simulations through Accelerad. If not, it should be possible to use files written through Honeybee to run Accelerad simulations. I will defer to Mostapha on the possibility of incorporating Accelerad in the Honeybee project).
…
ion of both Ladybug and Honeybee. Notable among the new components are 51 new Honeybee components for setting up and running energy simulations and 15 new Ladybug components for running detailed comfort analyses. We are also happy to announce the start of a comprehensive tutorial series on how to use the components; the first one, on getting started with Ladybug, can be found here:
https://www.youtube.com/playlist?list=PLruLh1AdY-Sj_XGz3kzHUoWmpWDXNep1O
A second one on how to use the new Ladybug comfort components can be found here:
https://www.youtube.com/playlist?list=PLruLh1AdY-Sho45_D4BV1HKcIz7oVmZ8v
Here is a short list highlighting some of the capabilities of this current Honeybee release:
1) Run EnergyPlus and OpenStudio Simulations - A couple of components to export your HBZones into IDF or OSM files and run energy simulations right from the Grasshopper window! Also included are several components for adjusting the parameters of the simulations and requesting a wide range of possible outputs.
2) Assign EnergyPlus Constructions - A set of components that allow you to assign constructions from the OpenStudio library to your Honeybee objects. This also includes components for searching through the OpenStudio construction/material library and components to create your own constructions and materials.
3) Assign EnergyPlus Schedules and Loads - A set of components for assigning schedules and loads from the OpenStudio library to your Honeybee zones. This includes the ability to auto-assign these based on your program or to tweak individual values. You can even create your own schedules from a stream of 8760 values with the new “Create CSV Schedule” component. Lastly, there is a component for converting any E+ schedule to 8760 values, which you can then visualize with the standard Ladybug components.
4) Assign HVAC Systems - A set of components for assigning some basic ASHRAE HVAC systems that can be run with the Export to OpenStudio component. You can even adjust the parameters of these systems right in Grasshopper.
Note: The ASHRAE systems are only available for OpenStudio and can’t be used with Honeybee’s EnergyPlus component. Also, only ideal air, VAV and PTHP systems are currently available but more will be on their way soon!
5) Import And Visualize EnergyPlus Results - A set of components to import numerical EnergyPlus simulation results back into Grasshopper so that they can be visualized with any of the standard Ladybug components (i.e. the 3D chart or psychrometric chart). Importers are made for zone-level results as well as surface-level results, and surface results can be easily separated based on surface type. This also means that E+ results can be analyzed with the new Ladybug comfort calculator components and used in shade or natural ventilation studies. Lastly, there is a set of components for coloring zone/surface geometry with EnergyPlus results and for coloring the shades around zones with shade desirability.
6) Increased Radiance and Daysim Capabilities - Several updates have also been made to the existing Radiance and Daysim components including parallel Radiance Image-based analysis.
7) Visualize HBObject Attributes - A few components have been added to assist with setting up Honeybee objects and ensuring that the correct properties have been assigned. These include components to separate surfaces based on boundary condition and components to label surfaces and zones with virtually any of their EnergyPlus or Radiance attributes.
8) WIP Grizzly Bear gbXML Exporter - Lastly, the release includes a WIP version of the Grizzly Bear gbXML exporter, which will continue to be developed over the next few months.
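To give a flavor of the "Create CSV Schedule" idea from item 3 above: an annual schedule is just one value per hour of the year (8760 of them). A generic Python sketch, unrelated to the actual component's internals (the function name and the Monday assumption are mine):

```python
HOURS_PER_YEAR = 8760

def occupancy_schedule(workday_start=9, workday_end=17):
    """Build one value per hour of the year: 1.0 during weekday
    working hours, 0.0 otherwise. January 1 is treated as a Monday
    here purely for illustration."""
    values = []
    for hour in range(HOURS_PER_YEAR):
        day = hour // 24           # day of year, 0-based
        hour_of_day = hour % 24
        is_weekday = (day % 7) < 5
        occupied = is_weekday and workday_start <= hour_of_day < workday_end
        values.append(1.0 if occupied else 0.0)
    return values
```

A stream like this, written to CSV, is exactly the kind of input the schedule components consume.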
And here’s a list of the new Ladybug capabilities:
1) Comfort Models - Three comfort models that have been translated to python for your use in GH: PMV, Adaptive, and Outdoor (UTCI). Each of these models has a “Comfort Calculator” component for which you can input parameters like temperature and wind speed to get out comfort metrics. These can be used in conjunction with EPW data or EnergyPlus results to calculate comfort for every hour of the year.
2) Ladybug Psychrometric Chart - A new interactive psychrometric chart that was made possible thanks to the release of the Berkeley Center for the Built Environment Comfort Tool code (https://github.com/CenterForTheBuiltEnvironment/comfort-tool). The new psychrometric chart allows you to move the comfort polygon around based on PMV comfort metrics, plot EPW or EnergyPlus results on the psych chart, and see how many hours are made comfortable in each case. The component also allows you to plot polygons representing passive building strategies (like internal heat gain or evaporative cooling), which adjust dynamically with the comfort polygon and are based on the strategies included in Climate Consultant.
3) Solar Adjusted MRT and Outdoor Shade Evaluator - A component has been added to allow you to account for shortwave solar radiation in comfort studies by adjusting mean radiant temperature. This adjusted MRT can then be factored into outdoor comfort studies and used with a new Ladybug Comfort Shade Benefit Evaluator to design outdoor shades and awnings.
4) Wind Speed - Two new components for visualizing wind profile curves and calculating wind speed at particular heights. These allow users to translate EPW wind speed from the meteorological station to the terrain type and height above ground for their site. They will also help inform the CFD simulations that will be coming in later releases.
5) Sky Color Visualizer - A component has been added that allows you to visualize a clear sky for any hour of the year in order to get a sense of the sky qualities and understand light conditions in periods before or after sunset.
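For the curious, the wind-speed translation mentioned in item 4 above is typically done with a power-law profile of the kind used by EnergyPlus/ASHRAE. A sketch with typical textbook coefficients (not necessarily the exact values the Ladybug components use):

```python
def wind_speed_at_height(u_met, z, terrain_alpha=0.22, terrain_delta=370.0,
                         met_alpha=0.14, met_delta=270.0, z_met=10.0):
    """Translate a meteorological-station wind speed (measured at z_met,
    usually 10 m over open terrain) to height z over another terrain,
    using the ASHRAE-style power-law profile:

        U(z) = U_met * (met_delta / z_met)**met_alpha
                     * (z / terrain_delta)**terrain_alpha

    The exponents (alpha) and gradient heights (delta) given here are
    typical textbook values for open country and suburban terrain.
    """
    return (u_met * (met_delta / z_met) ** met_alpha
                  * (z / terrain_delta) ** terrain_alpha)
```

Here `met_alpha`/`met_delta` describe the open terrain around the weather station and `terrain_alpha`/`terrain_delta` the site's terrain; rougher terrain (higher alpha) slows the wind near the ground.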
Ready to Start?
Here is what you will need to do:
Download Honeybee and Ladybug from the same link here. Make sure that you remove any old version of Ladybug and Honeybee if you have one, as mentioned on the Ladybug group page.
You will also need to install RADIANCE, DAYSIM and ENERGYPLUS on your system. We already sent a video about how to get RADIANCE and Daysim installed (link). You can download EnergyPlus 8.1 for Windows from the DOE website (http://apps1.eere.energy.gov/buildings/energyplus/?utm_source=EnergyPlus&utm_medium=redirect&utm_campaign=EnergyPlus%2Bredirect%2B1).
“EnergyPlus is a whole building energy simulation program that engineers, architects, and researchers use to model energy and water use in buildings.”
“OpenStudio is a cross-platform (Windows, Mac, and Linux) collection of software tools to support whole building energy modeling using EnergyPlus and advanced daylight analysis using Radiance.”
Make sure that you install EnergyPlus in a folder whose path contains no spaces (e.g. “C:\Program Files” has a space between “Program” and “Files”, so avoid it). A good option is C:\EnergyPlusV8-1-0, which is usually the default location suggested by the installer.
New Example Files!
We have put together a large number of new updated example files and you should use these to get yourself started. You can download them from the link on the group page.
New Developers:
Since the last release, we have had several new members join the Ladybug + Honeybee developer team:
Chien Si Harriman - Chien Si has contributed a large amount of code and new components in the OpenStudio workflow including components to add ASHRAE HVAC systems into your energy models and adjust their parameters. He is also the author of the Grizzly Bear gbxml exporter and will be continuing work on this in the following months.
Trygve Wastvedt - Trygve has contributed a core set of functions that were used to make the new Ladybug Colored Sky Visualizer and has also helped sync the Ladybug Sunpath to give sun positions for the current year of 2014.
Abraham Yezioro - Abraham has contributed an awesome new bioclimatic chart for comfort analyses, which, despite its presence in the WIP tab, is nearly complete!
Djordje Spasic - Djordje has contributed a number of core functions that were used to make the new Ladybug Wind Speed Calculator and Wind Profile Visualizer components and will be assisting with workflows to process CFD results in the future. He also has some more outdoor comfort metrics in the works.
Andrew Heumann - Andrew contributed an endlessly useful list item selector, which can adjust based on the input list, and has multiple applications throughout Ladybug and Honeybee. One of the best is for selecting zone-level programs after selecting an overall building program.
Alex Jacobson - Alex also assisted with the coding of the wind speed components.
And, as always, a special thanks goes to all of our awesome users who tested the new components through their several iterations. Special thanks goes to Daniel, Michal, Francisco, and Agus for their continuous support. Thanks again for all the support, great suggestions and comments. We really cannot thank you enough.
Enjoy!
Ladybug + Honeybee Development Team
PS: If you want to stay updated on Ladybug and Honeybee news, like Ladybug’s Facebook page (https://www.facebook.com/LadyBugforGrasshopper) or follow Ladybug’s Twitter account (@ladybug_tool).
…
Sunpath component. This is essentially what I did in the upper photo of my hometown's mask.
For now this works only for metric units. I will make sure that for the next release at least feet are supported as well.
As for the time it takes: try increasing the maxVisibilityRadius_ input to, say, 300. Depending on your PC configuration and internet speed, it may take as long as 15 minutes for the component to run. The topography file will first be downloaded from opentopography.org; that's the .tif file you noticed. Once the mask is created it will be saved to an .obj file. The next time you run it, the mask will be imported from the .obj file, skipping the previous 15 minutes:
It still may take a couple of minutes (depending on your PC configuration) for the component to finish loading the mask. The reason is that the mask needs to be scaled and centered according to the context_ input.
Also, the next time a user decides to change the maskStyle_ or context_ input, the topography data will not be downloaded from the opentopography.org website again, but rather created from the .tif file.
For the default maximalVisibility_ of 100, these .tif files are mostly a couple of megabytes, which is not much of a burden on the user's hard drive. On the other hand, keeping these .tif files on the user's hard drive helps save opentopography's bandwidth cap.
Let me know if I can clarify any further detail or if this hasn't been clear.
Hi Chris,
Thank you too. Please provide the following data:
1) Zip the "terrain shading mask libraries 32-bit" folder in "c:\ladybug" if you have the x86 version of Rhino 5, or the "terrain shading mask libraries 64-bit" folder if you have the 64-bit version. Upload the zipped folder and post the link here, please. Zip the whole folder, not only its contents.
2) What is the full name of the GDAL libraries .zip file that you downloaded? What is your Windows version and Rhino 5 version?
On genDayMtx.exe and the installation of the GDAL libraries: I am reluctant to do away with the manual install because of the file-blocking issue. Copying two folders manually is quite a small price to pay compared with hunting for the blocked library among tens of them.…