Literal Odds and Ends
-
- Posts: 12950
- Joined: Wed Nov 30, 2016 10:27 pm
- Location: The Great Place
Re: Literal Odds and Ends
4 year old: Why you shooting him?
7 year old: Cause I want to.
4 year old: You're shooting your teammate!
We might need some lessons on honor.
GrumpyCatFace wrote: Dumb slut partied too hard and woke up in a weird house. Ran out the door, weeping for her failed life choices; concerned townsfolk noted her appearance and alerted the fuzz.
viewtopic.php?p=60751#p60751
-
- Posts: 38685
- Joined: Wed Nov 30, 2016 5:59 pm
Re: Literal Odds and Ends
I might be the only software engineer left in the forum, but in case anybody wants a challenge...
I challenge you to defeat me at building a neural network that distinguishes between mines and rocks in sonar data from the following data set:
http://archive.ics.uci.edu/ml/datasets/ ... .+Rocks%29
The original research paper that provided the data set (naval research) managed to get 84.5% correct identification on the training set of aspect-angle-independent targets. On aspect-angle-dependent targets, they scored 89.2%. They report that human sonar operators on the same data scored between 88% and 97% correct identification.
The original paper used a simple neural network with a single hidden layer, varying the number of hidden nodes, trained with the standard backpropagation algorithm.
Can you beat the researchers? The human sonar operators? Or my code?
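For anyone curious what that kind of network looks like in code, here is a minimal Python/numpy sketch of a single-hidden-layer net trained with plain backpropagation, roughly the shape the paper describes. It assumes the 60-band readings are already loaded into an array X of shape (n, 60) with 0/1 labels y (1 = mine); the hidden size, learning rate, and epoch count are my own guesses, not the paper's settings.
[code]
# Minimal single-hidden-layer network with plain backpropagation (numpy sketch).
# Assumes X is an (n, 60) float array of sonar returns and y is an (n,) array
# of 0/1 labels (0 = rock, 1 = mine). Hyperparameters are guesses.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X, y, hidden=12, lr=0.1, epochs=2000, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W1 = rng.normal(0, 0.1, (d, hidden))   # input -> hidden weights
    b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.1, (hidden, 1))   # hidden -> output weights
    b2 = np.zeros(1)
    y = y.reshape(-1, 1).astype(float)

    for _ in range(epochs):
        # Forward pass
        h = sigmoid(X @ W1 + b1)            # hidden activations, (n, hidden)
        out = sigmoid(h @ W2 + b2)          # predicted probability of "mine"

        # Backward pass (squared-error loss, classic backprop)
        d_out = (out - y) * out * (1 - out)     # (n, 1)
        d_h = (d_out @ W2.T) * h * (1 - h)      # (n, hidden)

        W2 -= lr * h.T @ d_out / n
        b2 -= lr * d_out.mean(axis=0)
        W1 -= lr * X.T @ d_h / n
        b1 -= lr * d_h.mean(axis=0)

    return W1, b1, W2, b2

def predict(X, W1, b1, W2, b2):
    return (sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5).astype(int).ravel()
[/code]
You would then compare predict() on a held-out split against the true labels; whether it beats the paper's 84-89% depends heavily on the split and on those guessed settings.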
-
- Posts: 25285
- Joined: Wed Nov 30, 2016 6:50 am
- Location: Ohio
Re: Literal Odds and Ends
Probably involves looking at the average energy returned and the variation in signal strength. I could write something like that in SQL and give a probability for the object's nature.
But I have a job.
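That "average and variation" idea is easy enough to sanity-check without SQL. A rough Python sketch, assuming the labelled readings have already been separated into two (n, 60) numpy arrays called rocks and mines (the names are illustrative):
[code]
# Rough Python version of the "average and compare" idea: summarize the average
# returned energy and the spread for each class, and see how far apart they sit.
import numpy as np

def compare_classes(rocks, mines):
    for label, readings in (("rocks", rocks), ("mines", mines)):
        mean_energy = readings.mean()                    # overall average return
        band_range = readings.max(0) - readings.min(0)   # total range per frequency band
        print(f"{label}: mean energy {mean_energy:.3f}, "
              f"mean per-band range {band_range.mean():.3f}")
    gap = np.abs(rocks.mean(0) - mines.mean(0))          # gap between class averages, per band
    print("largest per-band gap between class means:", round(float(gap.max()), 3))

# Usage (with arrays loaded from the UCI data): compare_classes(rocks, mines)
[/code]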
-
- Posts: 38685
- Joined: Wed Nov 30, 2016 5:59 pm
Re: Literal Odds and Ends
GrumpyCatFace wrote: Probably involves looking at the average energy returned and the variation in signal strength. I could write something like that in SQL and give a probability for the object's nature.
But I have a job.
I think it would be pretty hard to do it directly. Training a neural network to distinguish between them is pretty easy; I am almost done writing it and have put about four hours total into it. I am also writing it with an eye toward expanding this into a larger library of my own for more difficult problems.
You could probably write a little ANN program in Python in an hour to do this.
The challenge is in beating the benchmarks of the original research paper and hopefully the human sonar operators.
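As a rough illustration of the "hour in Python" version, something like the following scikit-learn sketch would do it. The local file name sonar.all-data and the hyperparameters are assumptions on my part, not anyone's actual code:
[code]
# Quick-and-dirty version with scikit-learn: one hidden layer, backprop under the hood.
# Assumes the UCI file is saved locally as "sonar.all-data" (60 columns + R/M label).
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

df = pd.read_csv("sonar.all-data", header=None)
X = df.iloc[:, :60].to_numpy()
y = (df.iloc[:, 60] == "M").astype(int)   # 1 = mine, 0 = rock

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)
clf = MLPClassifier(hidden_layer_sizes=(12,), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
[/code]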
-
- Posts: 25285
- Joined: Wed Nov 30, 2016 6:50 am
- Location: Ohio
Re: Literal Odds and Ends
Speaker to Animals wrote: I think it would be pretty hard to do it directly. Training a neural network to distinguish between them is pretty easy; I am almost done writing it and have put about four hours total into it. You could probably write a little ANN program in Python in an hour to do this. The challenge is in beating the benchmarks of the original research paper and hopefully the human sonar operators.
No human being is scanning a giant table of 3-decimal numbers for a pattern. That's worse than a labor camp.
-
- Posts: 38685
- Joined: Wed Nov 30, 2016 5:59 pm
Re: Literal Odds and Ends
GrumpyCatFace wrote: No human being is scanning a giant table of 3-decimal numbers for a pattern. That's worse than a labor camp.
The numbers are extracted from whatever sonar interface the operators use, I'd imagine. I don't know much about sonar, but I do know a little bit of radar, and it's not like they just display raw data on the screen. It's visualized somewhat.
-
- Posts: 25285
- Joined: Wed Nov 30, 2016 6:50 am
- Location: Ohio
Re: Literal Odds and Ends
Speaker to Animals wrote: The numbers are extracted from whatever sonar interface the operators use, I'd imagine. I don't know much about sonar, but I do know a little bit of radar, and it's not like they just display raw data on the screen. It's visualized somewhat.
My point is that we can't 'compete with the sonar operators' unless we're using their highly specialized software. We could compete with each other, sure.
-
- Posts: 38685
- Joined: Wed Nov 30, 2016 5:59 pm
Re: Literal Odds and Ends
GrumpyCatFace wrote: My point is that we can't 'compete with the sonar operators' unless we're using their highly specialized software. We could compete with each other, sure.
Then give up. You just have to beat 97%. I think I can do it, but I will have to play around with network architecture. My guess is that the original researchers didn't use enough layers, since there is obviously more than one feature in each reading.
My back is still sore, so I am limiting coding a little today, but I will get back into it tonight. I already ported some of my earlier backpropagation code into it and got the parser written to get all this data into a training set my ANN code can use. I just need to build the network and play around with architecture now.
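For reference, a parser for this data set is only a dozen lines of standard-library Python. This is a sketch of the general idea, not his actual code; it assumes the UCI file is saved locally as sonar.all-data, with 60 comma-separated returns per row plus an R/M label:
[code]
# Sketch of the parsing step: read the UCI sonar file, shuffle, and split
# into training and test sets that network code can consume.
import csv
import random

def load_sonar(path="sonar.all-data", test_fraction=0.25, seed=0):
    rows = []
    with open(path, newline="") as f:
        for row in csv.reader(f):
            if not row:
                continue
            features = [float(v) for v in row[:60]]
            label = 1 if row[60] == "M" else 0   # 1 = mine, 0 = rock
            rows.append((features, label))
    random.Random(seed).shuffle(rows)
    cut = int(len(rows) * (1 - test_fraction))
    return rows[:cut], rows[cut:]                # (train, test) lists of (features, label)
[/code]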
-
- Posts: 25285
- Joined: Wed Nov 30, 2016 6:50 am
- Location: Ohio
Re: Literal Odds and Ends
Speaker to Animals wrote: Then give up. You just have to beat 97%. I think I can do it, but I will have to play around with network architecture. My guess is that the original researchers didn't use enough layers, since there is obviously more than one feature in each reading.
Again, just feed the Metals into one SQL table and the Rocks into another, take the averages, and compare. You MIGHT have to examine variability, which would just be looking at the total range of the signal. There's a pattern in there that shouldn't be too hard to find. Apply your pattern to the unknown signal, and done.
This task doesn't require a neural network, or advanced programming, unless you're trying to actually visualize the object itself.
#oldschoolprogramming
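That baseline is also easy to test directly: average each class in the training data and label an unknown reading by whichever class average it sits closer to. A minimal Python sketch (no SQL), assuming (n, 60) feature arrays and 0/1 labels from whatever train/test split you prefer:
[code]
# "Old school" baseline: per-class averages plus a nearest-average decision.
# X_* are (n, 60) numpy arrays of readings, y_* are 0/1 labels (1 = mine).
import numpy as np

def fit_means(X_train, y_train):
    rock_mean = X_train[y_train == 0].mean(axis=0)   # average rock return per band
    mine_mean = X_train[y_train == 1].mean(axis=0)   # average mine return per band
    return rock_mean, mine_mean

def predict_by_mean(X, rock_mean, mine_mean):
    d_rock = np.linalg.norm(X - rock_mean, axis=1)   # distance to the rock average
    d_mine = np.linalg.norm(X - mine_mean, axis=1)   # distance to the mine average
    return (d_mine < d_rock).astype(int)             # closer to the mine average -> call it a mine

def accuracy(X_test, y_test, rock_mean, mine_mean):
    return (predict_by_mean(X_test, rock_mean, mine_mean) == y_test).mean()
[/code]
Whether that gets anywhere near the paper's 84-89%, let alone the operators' 97%, is exactly what the challenge would settle.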