Crow Discussions - Grasshopper (https://www.grasshopper3d.com/group/crow/forum)

Time series forecasting
2021-01-18 · Sebastian Clark Koth (https://www.grasshopper3d.com/profile/SebastianClarkKoth)
<p>Hello,</p>
<p>We are four students who want to develop a Grasshopper-based tool to support architects and urban planners in their decision making.</p>
<p>Our idea for this is to forecast urban growth using classified images (in the form of a time series). At first, the idea was to take time series of satellite imagery and label them into urban and non-urban. Since this step alone seemed too time-consuming for now, we took a different approach. We photographically documented the growth of the slime mold Physarum polycephalum in a Petri dish (about 2000 images at 2-minute intervals) and then labeled them into fungal and non-fungal.</p>
<p>We would now like to use this labeled data (in the form of a list of black levels for each pixel) to make a prediction using Crow. <br/>Our first consideration is to use the pixel values of time step n as input and those of time step n+X as output, and to train the network this way (X could be 1, i.e. 2 min into the future, or 10, i.e. 20 min into the future). We would then hold out a later time step, not included in the training set, as a test, so that we could compare the real output with the generated one.</p>
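The pairing of time step n with time step n+X described above could be sketched like this in Python. This is only an illustration of the data preparation, not part of Crow itself; the function name `make_pairs` and the assumption that each frame is already a flat array of per-pixel black levels are mine.

```python
import numpy as np

def make_pairs(frames, horizon):
    """frames: array of shape (T, n_pixels), one row of black levels per
    time step. Returns (inputs, targets) where targets[i] is the frame
    `horizon` steps after inputs[i]."""
    T = frames.shape[0]
    X = frames[: T - horizon]   # time step n
    Y = frames[horizon:]        # time step n + horizon
    return X, Y
```

Holding out the last pairs (rather than a random subset) as the test set would match the idea of testing on a later, unseen time step.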
<p>Now the questions:</p>
<p>How realistic is it to perform such a prediction using the backpropagation approach in Crow, especially with regard to the structure of the input and output (both long lists of pixel values)? Will it detect the patterns of the temporal component, even though they are not explicitly stored in the inputs and outputs?</p>
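One common way to give a plain feedforward/backpropagation net some explicit temporal context is to concatenate the last k frames into a single input vector, instead of feeding only one frame. A minimal sketch of that windowing (my own helper, assuming the same flat-frame layout as above):

```python
import numpy as np

def window_pairs(frames, k, horizon):
    """Concatenate the last k frames into one input vector; the target is
    the frame `horizon` steps after the newest frame in the window."""
    T = frames.shape[0]
    X = np.stack([frames[i : i + k].ravel() for i in range(T - k - horizon + 1)])
    Y = frames[k + horizon - 1 :]
    return X, Y
```

Whether Crow's backpropagation components cope with input vectors this long is a separate question; this only shows how the temporal component could be made explicit in the data.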
<p>Or, in general, is it realistic to do this kind of time prediction with Crow?<br/><br/>Thanks in advance to anyone who might be able to help!</p>

The problem of loading Crow
2020-05-13 · JimmyLee871013 (https://www.grasshopper3d.com/profile/JimmyLee871013)
<p>Hi~~~~</p>
<p></p>
<p>I found it is impossible to use Crow in Grasshopper. Rhino said:<br/>An error occured during GHA assembly loading:<br/>Path: C:\Users\17641\AppData\Roaming\Grasshopper\Libraries\Crow\Crow.gha<br/>Exception System.TypeLoadException:<br/>Message: Could not load type 'Crow.Core.Backpropagation.BackpropagationNetwork' from assembly 'core, Version=0.0.0.0, Culture=neutral, PublicKeyToken=null'.<br/><br/>I think that because Crow is based on the C# library NeuronDotNet, and I did not install this library on my laptop, the laptop cannot load it.<br/><br/>I have already downloaded NeuronDotNet from the website, but I do not know how to install it on my laptop. Can you explain my problem? And if the problem is the NeuronDotNet library, please help me install it on my laptop. THX~~~</p>
<p></p>
<p><span>THX~~~~</span></p>
<p></p>
<p><span>Jimmy</span></p>

Getting no Valid output from Backpropagation
2018-10-04 · Tobias Heimig (https://www.grasshopper3d.com/profile/TobiasHeimig)
<p>Hey, first of all I want to thank you for this really exciting plug-in.</p>
<p></p>
<p>My main problem is that the output I get from the back-propagation solver and the resulting trained net is different than expected, and I am not able to find the reason why... I hope you can help me.</p>
<p></p>
<p>The basic idea is quite simple. I have a complex fabrication process and a quite complex, computation-intensive way of testing whether a piece can be fabricated. I generated 5000 training vectors for input and output: in = [length, Angle1, Angle2] and out = [bool-Possible].</p>
<p><br/>I trained the net over 50,000 cycles using a learning rate of 0.01.</p>
<p><br/>My basic problem is that the vector classifier outputs the same output vector for any input vector I choose, even when it is identical to one of the training vectors... any idea why?</p>
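One common cause of a network producing the same output for every input is unnormalized input data: if raw lengths (e.g. in millimetres) sit next to angles (in degrees), large magnitudes can saturate the sigmoid units and the net collapses to a constant prediction. I don't know whether Crow normalizes inputs internally, so this is only a hypothetical preprocessing step, sketched in Python with a made-up helper name:

```python
import numpy as np

def normalize(X):
    """Scale each column of X (e.g. [length, Angle1, Angle2]) to [0, 1].
    Constant columns are mapped to 0 instead of dividing by zero."""
    lo, hi = X.min(axis=0), X.max(axis=0)
    return (X - lo) / np.where(hi > lo, hi - lo, 1.0)
```

If the inputs were the issue, normalizing the 5000 training vectors (and any test vector, with the same lo/hi) before training should make the outputs vary again.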
<p></p>
<p>I uploaded the script as well... :)<br/>Thank you for your help.</p>
<p></p>
<p>P.S. I think a method showing the progress of the training while cycling would be very helpful. I know this is difficult because looping components freeze the Grasshopper canvas. Here I upload a simple Python component which uses the Rhino command line to output progress while cycling; maybe this would be a possible solution...</p>
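The command-line idea above could look something like this. The helper below is my own sketch, not the uploaded component; inside a GhPython component you could pass `write=Rhino.RhinoApp.WriteLine` (from RhinoCommon) so the messages go to the Rhino command line even while the Grasshopper canvas is frozen.

```python
def report_progress(cycle, total, error, every=1000, write=print):
    """Report training progress every `every` cycles and at the end.
    `write` is injected so the same code works with print (for testing)
    or Rhino.RhinoApp.WriteLine (inside Rhino/Grasshopper)."""
    if cycle % every == 0 or cycle == total:
        write("cycle %d/%d  error %.5f" % (cycle, total, error))
```

Throttling with `every` matters: writing on every one of 50,000 cycles would itself slow the training loop down noticeably.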
<p></p>
<p>Kind Regards </p>
<p>Tobi</p>