
Progress Log
Fall Week 24
11/21 - 11/27
Kyle -
Continued work on the project report; updates were made to Chapter 3 as requested by Dr. Ejaz in the last meeting.
​
Mark -
Conducted another test with volunteers at a simulated four-way intersection. Two scenarios were tested, one after the other: one-after-another arrival and simultaneous arrival.
​
The initial one-after-another test on each side of the simulated intersection had a total of twenty trials, and 18 out of 20 were successful.
The simultaneous arrival test was not as positive, with just 10 trials in total; the test was cut short when security notified us that we would need to conclude it. Five trials used two vehicles arriving, and of these only once were both vehicles directed correctly. In the four failed attempts only one vehicle was given an instruction to proceed.
Next, three to four vehicles arriving at once was tested, and 0 of 5 trials worked completely. The breakdown consisted of three trials where 1 of 3 vehicles received an instruction, one where 2 of 3 vehicles received an instruction, and one where 2 of 4 vehicles received an instruction.
Only once was there a module error, with one module flashing between red and green. The two videos below show a three-car arrival trial.
​
Group -
Made adjustments to the enclosures and completed a two-vehicle test in which 9 out of 10 trials were successful. The single failure was due to a connectivity fault, and the test was concluded at that point as we did not have enough vehicles or space to continue.
​


Fall Week 23
11/14 - 11/20
Kyle -
Continued work on the project report; updates were made to Chapters 2-5 as requested by Dr. Ejaz in the last meeting. Prepared relays and control circuitry for the pedestrian lights. Determined final camera placement and mounted the cameras to the modules. The cameras were sealed with waterproof silicone sealant to ensure that the modules remain weatherproof.
​
Mark -
The intersection-viewing feature, which determines whether it is safe to enter the intersection, was completely overhauled. Initial testing had everything running from the primary Pi, which turned out not to be feasible, as the primary could not run both the intersection viewing and the vehicle detection simultaneously with two cameras. The workaround was to have a Pi Zero handle the intersection processing and connect it directly to the primary, signaling on a GPIO pin whether or not the intersection is clear.
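A minimal sketch of how this GPIO handshake could look, assuming the RPi.GPIO library and an example pin number (the actual pin used on the hardware is not recorded here):

# Sketch only: the Pi Zero drives a GPIO pin high while the intersection is clear.
# BCM pin 17 is an assumed example, not necessarily the project's wiring.
import RPi.GPIO as GPIO

CLEAR_PIN = 17  # hypothetical pin wired across to the primary Pi

GPIO.setmode(GPIO.BCM)
GPIO.setup(CLEAR_PIN, GPIO.OUT, initial=GPIO.LOW)

def report_intersection(clear):
    # Raise the line when the camera sees an empty intersection, drop it otherwise.
    GPIO.output(CLEAR_PIN, GPIO.HIGH if clear else GPIO.LOW)

# On the primary Pi the matching pin is configured as an input and polled:
#   GPIO.setup(CLEAR_PIN, GPIO.IN)
#   if GPIO.input(CLEAR_PIN): release the next queued vehicle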
​
Conducted a live indoor test with individuals as the objects of interest. Due to the lack of a fourth battery at the time, the test only used three of the four modules. During this testing period, two of the three worked as intended, with confirmation that the lights reacted to the incoming individuals; the primary was one of these two. The second working module initially only had the red LED engage; on investigation it was found that the connection to the green LED was not being output from the relay, and it is thought that the wired connection was not making contact with the relay. The third module did not react to incoming persons. Observing through the primary, it looked as if the third attempted to send its command to the primary but nothing was sent back. The Bluetooth connection seems to be the issue, and the module was taken down to be troubleshot. The next day, two-way Bluetooth connectivity was reestablished.
​
Another test was conducted with a fourth battery in hand, but the battery was not able to power the module and a substitute for it was not working either, so the test was again conducted with only three of the four modules. This test took place outside in the Valencia parking lot with a vehicle. Two of the three modules began having camera issues, with the cameras not initializing; these cameras were replaced with extras on site and will be troubleshot later. As the test commenced, each module reacted as intended, giving a command that was confirmed through the Python console. Confirmed visual commands were also present on all but one module, the same one that had the relay connection issues. Through testing it was found that the relay port where the green LED is connected was not responsive at all; a new relay needs to be installed.
Fall Week 22
11/07 - 11/13
Kyle -
Continued completing the report, making necessary updates to engineering requirements and block diagrams.
Mark -
Continued debugging and got the queue to work as intended with a video recording of traffic. The project reacts to incoming traffic, and the LED signaling is working on a timer for this initial testing.
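A rough sketch of what the timer-driven signaling could look like, assuming RPi.GPIO with relay pins and a green interval chosen purely for illustration:

# Sketch only: hold the green relay on for a fixed interval when a queued vehicle is served.
# Pin numbers and the 5-second interval are illustrative assumptions.
import time
import RPi.GPIO as GPIO

RED_PIN, GREEN_PIN = 23, 24   # hypothetical relay control pins
GREEN_TIME = 5.0              # seconds of green per served vehicle

GPIO.setmode(GPIO.BCM)
GPIO.setup([RED_PIN, GREEN_PIN], GPIO.OUT, initial=GPIO.LOW)

def serve_vehicle():
    GPIO.output(RED_PIN, GPIO.LOW)
    GPIO.output(GREEN_PIN, GPIO.HIGH)
    time.sleep(GREEN_TIME)            # the timer used for this initial testing
    GPIO.output(GREEN_PIN, GPIO.LOW)
    GPIO.output(RED_PIN, GPIO.HIGH)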
Group -
Both worked on completing the electrical build of what will go inside the secondary boxes and on mounting the LEDs on the outside of them. The camera module will be the last component installed, as the optimum position still needs to be established.
​
Fall Week 21
10/31 - 11/06
Kyle -
Focused on the report; updates were made to Chapters 1 & 2 as discussed with Dr. Ejaz in the previous week's meeting.
Mark -
Continued debugging of the overall queuing while simultaneously running the detection code. Minor progress has been made, as the code was revamped to make debugging easier. Currently the main queue still does not hold the second arriving vehicle. A suggestion was made to have a queue on each secondary that would hold the order of incoming vehicles and send them only when requested by the primary microcontroller.
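A minimal sketch of that suggestion, with each secondary buffering its own arrivals and releasing one only when the primary asks (names are illustrative, not the project's actual code):

# Sketch only: a secondary-side queue that preserves arrival order and hands over
# exactly one entry per request from the primary.
from collections import deque

class SecondaryQueue:
    def __init__(self):
        self._arrivals = deque()

    def vehicle_detected(self, timestamp):
        # Called by the detection code as each vehicle arrives.
        self._arrivals.append(timestamp)

    def on_primary_request(self):
        # Called only when the primary requests the next vehicle.
        if self._arrivals:
            return self._arrivals.popleft()
        return None  # nothing waiting at this approach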
​
​
Fall Week 20
10/24 - 10/30
Kyle -
Completed construction of secondary microcontroller housings. The secondary housings were constructed out of 3/4" plywood with a hinge along the left hand side. Final dimensions of the housings mirror that of the primary housing.
​
Work also began on the report; Chapters 1, 2, and 3 are in progress and being added to the website for review.
​
Mark -
Testing of the object detection with queuing commenced, and results have been promising. Testing currently uses recorded video of traffic at a stop sign, as I do not have enough cameras for a live feed. Oncoming traffic is being registered and sent to the primary Pi to be queued, and a response is sent back to initialize the green light. More debugging is needed, as the delay is not initializing before the second vehicle in the queue receives its command.
​
The maximum Bluetooth range of the microcontroller is theoretically 60 meters (196.85 ft), which should be more than enough, as a typical single intersection lane is about 10 feet wide. Using the Pythagorean theorem on a typical two-lane intersection (20 ft per side), the maximum distance diagonally across is about 28.28 ft. This is well below the stated theoretical range and below the practical measurement of 43 ft obtained when testing the Bluetooth range physically.
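The geometry checks out with a couple of lines (the 10 ft lane width is the estimate used above):

# Quick check of the diagonal across a 20 ft x 20 ft intersection.
import math

side_ft = 2 * 10                     # two 10 ft lanes per approach
diagonal_ft = math.hypot(side_ft, side_ft)
print(round(diagonal_ft, 2))         # 28.28 ft, far below the ~197 ft theoretical Bluetooth range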
​
​




Fall Week 19
10/17 - 10/23
Kyle -
Work on the queue took place with Mark. In order to send data back to the microcontroller from which it was first received, the primary microcontroller needed to have the MAC addresses of the secondary microcontrollers saved as local variables. Once this was fixed, queuing and dequeuing were able to occur.
​
A fan control board was added to the system; it allows a temperature to be set at which the cooling fan kicks on. Testing will now need to occur to determine how effectively the control board maintains the temperature within the housing. The temperature probe is placed nearest to the Raspberry Pi within the housing.
​
Construction of the secondary housings is underway; they are being built out of plywood for cost savings. The primary microcontroller will be the "finished product" version, while the secondary units will remain in prototype form.
​
Mark -
Development of the queue implementation continued in Python. Initially, incoming data from a Pi was being queued, but data was only being sent back when the queue was emptied. With Kyle's assistance in debugging, some progress was made in fixing the issue of receiving data.
​
Now data is received, queued, and sent, but another concern came up: an indicator is needed to differentiate the data coming from the three different Pis. The Bluetooth MAC address of each Pi needs to be hard-coded as a variable that is matched against the received data, and an if statement uses this variable to initiate sending data out from the queue.
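A hedged sketch of that matching logic (the MAC addresses and socket map below are placeholders, not the values used in the project):

# Sketch only: route a queued reply back to whichever secondary the data came from.
SECONDARY_1 = "AA:BB:CC:DD:EE:01"   # placeholder hard-coded MAC addresses
SECONDARY_2 = "AA:BB:CC:DD:EE:02"
SECONDARY_3 = "AA:BB:CC:DD:EE:03"

def reply_to(sender_mac, payload, sockets_by_mac):
    # sockets_by_mac maps each hard-coded MAC to its open Bluetooth socket.
    if sender_mac == SECONDARY_1:
        sockets_by_mac[SECONDARY_1].send(payload)
    elif sender_mac == SECONDARY_2:
        sockets_by_mac[SECONDARY_2].send(payload)
    elif sender_mac == SECONDARY_3:
        sockets_by_mac[SECONDARY_3].send(payload)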
​
This code as constructed works with manual manipulation; what is needed next is a live test, which will allow the specific changes needed for the code to work without interference from a user.



Fall Week 18
10/10 - 10/16
Kyle -
Final assembly of the primary microcontroller case took place, along with testing of the solar charging time. A weatherproof housing was utilized for the enclosure; it is IP67 rated, meaning that it is waterproof and dustproof. An IP67-rated connector was also utilized to run the power cable from the solar panel to the housing. The location of the cooling fan must still be determined, and the final bracketry for the solar panel still needs to be welded together.
​
Testing of the solar charge time showed that charging from 12.4V to a full charge of 13.3V takes under 2 hours. It was previously determined that over an 8-hour period the system would drain the battery by 0.4V.
​
Mark -
Through discussion, Professor Ejaz raised concerns about the newly chosen transmission method (MQTT) and suggested using Bluetooth instead. The need for a constant internet connection would bring more cost than needed due to the upkeep of a hotspot. Bluetooth communication would be simpler to handle at the cost of range, but with the project being a prototype this should not be much of a concern, as higher-quality modules could alleviate it. For initial testing, the onboard Bluetooth of the Raspberry Pis was used as the base. The Bluetooth protocol used in this development was RFCOMM, which follows a socket programming model. A simple server/client script was created with this model and then developed to handle more than just two Pis communicating.
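A minimal server/client sketch of that RFCOMM socket model, using Python's built-in Bluetooth socket support; the MAC address and channel are placeholders rather than the project's actual values:

# Sketch only: one RFCOMM server (primary) and one client (secondary).
import socket

SERVER_MAC = "00:00:00:00:00:00"  # placeholder for the primary Pi's Bluetooth MAC
CHANNEL = 1

def run_server():
    srv = socket.socket(socket.AF_BLUETOOTH, socket.SOCK_STREAM, socket.BTPROTO_RFCOMM)
    srv.bind((SERVER_MAC, CHANNEL))
    srv.listen(1)
    client, address = srv.accept()   # address[0] is the connecting secondary's MAC
    data = client.recv(1024)
    client.send(b"ack:" + data)      # send a confirmation back
    client.close()
    srv.close()

def run_client():
    cli = socket.socket(socket.AF_BLUETOOTH, socket.SOCK_STREAM, socket.BTPROTO_RFCOMM)
    cli.connect((SERVER_MAC, CHANNEL))
    cli.send(b"vehicle_detected")
    print(cli.recv(1024))
    cli.close()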
​
The figure above displays the project's code as well as the connection being established between just two Pis. The figures below show the communication between all of them; in the first, the main Pi in the top left receives data from the other three.
​
In the second figure the main Pi is in the same position, but sending data to the other three.
​






Fall Week 17
10/3 - 10/9
Kyle -
​
​
Mark -
Started development of a different form of transmission; building on prior research, began constructing code around MQTT. Development led to connecting to the MQTT broker, which routes the communication between devices connected through a network.
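A minimal broker-connection sketch, assuming the paho-mqtt client library (the log does not record the exact library, broker address, or topic names, so those are placeholders):

# Sketch only: connect to an MQTT broker and subscribe to a project topic (paho-mqtt 1.x style API).
import paho.mqtt.client as mqtt

BROKER = "192.168.1.10"        # placeholder broker address
TOPIC = "intersection/detect"  # placeholder topic name

def on_connect(client, userdata, flags, rc):
    print("Connected with result code", rc)
    client.subscribe(TOPIC)

def on_message(client, userdata, msg):
    print(msg.topic, msg.payload)

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect(BROKER, 1883, 60)
client.loop_forever()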
​
There were multiple instances of package errors, but with debugging this step has been cleared. The next step is having the other Raspberry Pis connect through this network to receive and send commands as they relate to the project.
​



Fall Week 17
9/26 - 10/2
Kyle -
​
Mark -
For better testing and code debugging, a video recording of where the project would be placed at the intersection was taken during the day and at night. To address some of the hypersensitivity of the object detection, an object-area criterion was added to take more of the complete size of the approaching vehicle into account. In the first tests, as the vehicle approached the camera, the program would break the object into many smaller boxes instead of combining them into one. This change was not foolproof, as detections would still occur in other areas of the frame, but since the program only logs objects as they pass a line from top to bottom, it is a good way to keep false positives to a minimum. Objects moving from bottom to top are completely ignored and do not give a false reading.
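A condensed sketch of that counting rule (minimum contour area plus a top-to-bottom line crossing); the thresholds, line position, and video filename are illustrative, and it tracks only the largest motion blob for simplicity:

# Sketch only: count motion that crosses a horizontal line moving downward.
import cv2

MIN_AREA = 4000      # illustrative area criterion to ignore small fragments
LINE_Y = 240         # illustrative y position of the counting line
cap = cv2.VideoCapture("intersection_test.mp4")   # placeholder recorded video
subtractor = cv2.createBackgroundSubtractorMOG2()
prev_cy, count = None, 0

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)
    _, mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    big = [c for c in contours if cv2.contourArea(c) >= MIN_AREA]  # area criterion
    if big:
        x, y, w, h = cv2.boundingRect(max(big, key=cv2.contourArea))
        cy = y + h // 2
        if prev_cy is not None and prev_cy < LINE_Y <= cy:   # downward crossing only
            count += 1                                        # upward motion is ignored
        prev_cy = cy
    else:
        prev_cy = None

print("vehicles counted:", count)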
​
At night or in low-light conditions it is harder to detect a vehicle over a larger area, as the headlights add to the detection area; depending on the type of light used (e.g. HID), they can extend the detection area from a longer distance and cause the same issues with multiple detections. To circumvent this, a timer can be put in place so the queue does not log another rapid detection. A live feed would work best for this, as processing a recorded video slows the frame rate and would not give an accurate time frame for the timer condition.
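A small sketch of that timer, with the cooldown length chosen only for illustration:

# Sketch only: ignore repeat detections inside a cooldown window so headlight
# flare at night cannot enqueue the same vehicle twice.
import time

COOLDOWN_S = 3.0       # illustrative gap required between accepted detections
_last_accepted = 0.0

def accept_detection():
    global _last_accepted
    now = time.monotonic()
    if now - _last_accepted >= COOLDOWN_S:
        _last_accepted = now
        return True    # log this detection in the queue
    return False       # too soon after the previous one; treat as the same vehicle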
​


Fall Week 16
9/19 - 9/25
Kyle -
Due to personal and family issues, progress was not made this week on the enclosure or the complete power-drain testing.
​
Mark -
The transceivers continued to be tested for proper working order. Through these tests, only one secondary transceiver would properly send and receive the data assigned to it. The second secondary transceiver would only receive the data assigned to it, and the third would not receive or send data at all. The primary transceiver worked fine with the first secondary, but with the second, the primary would somehow cross lines with the first secondary and receive its payload as if it were the second's own. On further analysis, the third secondary transceiver was changing its addressing pipe on its own from what was assigned to it.
​
The figures above show where the pipe address matches between the primary and the secondary in question (left being the secondary and right being the primary), but in the next figure below the pipe address changes from what it was programmed to be.
​
I have not found an explanation for this change; for the time being I will focus on making the object detection more accurate or finding a way to deal with false positives during detection for queuing.
​



Fall Week 15
9/12 - 9/18
Kyle -
Researched new waterproof enclosures for the project. An enclosure was ordered that met all of the necessary requirements, however upon arrival the enclosure measurements were discovered to not match those given in the description online. A second enclosure was ordered and is expected to arrive on Thursday 09/16. The enclosure sizing issue prevented any further testing from taking place.
​
Mark -
Finished the second half of the transmission code, which focuses on cycling through addresses to receive data from the other Pis. Through testing it was found that, within the function, the separate RX addresses were not cycling because the radio would not stop listening for incoming data until it had received something. To alleviate this, a forced stop with a timer was implemented to continue the cycle.
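The force-stop pattern could look roughly like the sketch below; it assumes the lib_nrf24 driver and placeholder pipe addresses, and is not the project's actual code:

# Sketch only: cycle through each secondary's RX address, listening for a bounded
# time before moving on, so one silent transceiver cannot stall the loop.
import time

ADDRESSES = [[0xE1] * 5, [0xE2] * 5, [0xE3] * 5]   # placeholder 5-byte pipe addresses
LISTEN_TIMEOUT_S = 0.5                             # forced stop after half a second

def poll_secondaries(radio, read_payload):
    # radio: a configured lib_nrf24 NRF24 instance; read_payload: callable that
    # drains one payload from it. Returns a list of (address, payload-or-None).
    results = []
    for addr in ADDRESSES:
        radio.openReadingPipe(1, addr)
        radio.startListening()
        deadline = time.monotonic() + LISTEN_TIMEOUT_S
        payload = None
        while time.monotonic() < deadline:         # timer keeps the cycle moving
            if radio.available(0):
                payload = read_payload(radio)
                break
            time.sleep(0.01)
        radio.stopListening()
        results.append((addr, payload))
    return results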
Due to concerns about object detection reliability in low-light conditions, tests were made first to see if the system could pick up a stationary object and then a moving one. These tests were run at 7:50 PM and 5:30 AM. The outcome showed that objects can be detected, but more reliably with movement. There was a downside to detecting movement, though: the closer an object (in this case a vehicle) comes, the more OpenCV tries to gather detail and begins to split it into separate boxes as if there were multiple objects in frame. This can be attributed to how the detection separates movement in the frame by subtraction and binarization. Research and an adjustment are needed to correct this issue.





Fall Week 14
9/5 - 9/11
​
Mark -
At the end of the test for faulty components, it was confirmed that one of the four NRFs was indeed unresponsive. With the assistance of Professor Reed, extra components were made available. To ensure everything would be in working order, each NRF was rewired with each Pi. Python testing for the use of multiple addressing then started, with changes made to what would be the RX address.
​
​
Initially the RX address was changed to the 0xb2 arrangement, replacing all of the 0xc2 values. A send-and-receive test was made to establish a link; the link was successfully made and the code was tweaked around that. With additional testing across three Pis, the primary send-and-receive test was again successful. Efficiently cascading the "send" functions is a concern, as losing packet data is a problem. The reverse direction now needs to be tested and debugged. Currently the NRF cannot transmit and receive simultaneously, which could introduce time delays where it is not open to receive or transmit. A possible solution, using the extra NRF components, is to dedicate one NRF to transmitting and another to receiving; this needs further testing.
​
Kyle -
Testing of the complete system began. The system was first tested on battery power alone to determine the maximum run time. Then the system was tested with the solar panel connected to trickle-charge the battery up from its depleted state. During the trickle-charging test the enclosure overheated, causing the 3D-printed housing to deform. Testing was halted, as the enclosure was no longer structurally sound. Data from the battery testing can be seen below; testing will likely need to be repeated with a different enclosure design.
​
​



Fall Week 13
8/29 - 9/4
​
Mark -
As the re-coding of the NRF module continued, Professor Ejaz was contacted for insight. He suggested looking into assigning specific addresses, as the current addressing had been more up to chance when establishing a connection with more than two NRF modules. If the transmission problem continues, Bluetooth was raised as an option. Bluetooth could be a substitute, but it has its cons, as the range is lower in comparison to the NRF24L01. As testing continued with the NRF, a concern was raised that the NRF seemed not to be responsive at all and may have become faulty. A simple setup was made to test this, and it leans toward that being the case.
Kyle -
Design of the Android application began using Thunkable, a block-coding tool. The application will utilize Bluetooth to communicate with the primary microcontroller. The logic for the application is still being worked on and will be implemented into the overall code for the microcontrollers once finished. A preview of the application along with the code can be seen below.


Fall Week 12
8/22 - 8/28
​
Mark -
During the break between the Summer and Fall semesters, not much progress was made in programming the NRF modules. Some research was done on another form of communication, an IoT protocol called MQTT (Message Queuing Telemetry Transport). MQTT is a publish-subscribe network protocol that transports messages between devices. With new findings about NRF module addressing, another attempt was made to re-code for it. A test of the re-coded version was unsuccessful, and debugging continued into the next week.
Group -
On Friday of this week a team meeting was held where a proposed new schedule was made, tasks for each team member were assigned for the coming week, and a possible substitute for the enclosure material was discussed in case the 3D-printed enclosure cannot hold up against the elements.
Week 10
​
7/11 - 7/17
An update was made to the queue logic flowchart to create separate modes for the code, as suggested by Kyle's friend Justin. Additional changes were made to the code, including the addition of a dummy variable that travels with the data packet being sent from transceiver to transceiver. Testing was performed at the college with two microcontrollers to see how the communication had improved following the code updates. The attempt to link the microcontrollers to a specific address for transmission led to garbage data being received. More research is needed on addresses and pipes for transmission.


Week 9
7/04 - 7/10
Issues are still occurring with the transmission of data between microcontrollers to let the primary microcontroller know when a vehicle has been detected. Continued testing of the code occurred, and the code was able to transmit the detection of vehicles from the sample video. This transmission was able to send data from the primary to two secondary microcontrollers; with continued testing it was realized that the data was sent through one default address and pipe. Different addresses need to be associated with each secondary microcontroller, otherwise every microcontroller receives the same data at the same time. A meeting was set with Kyle's friend Justin to go over the code and look for any bugs whose fix could possibly resolve this issue.

Week 8
6/27 - 7/03
Through debugging it was determined that low memory was making the frame rate slower than intended. To remedy this, the code was restructured to run only OpenCV. OpenCV converts the incoming frames to black and white to better detect and recognize objects.
​
From this point a live test with an oncoming vehicle was made, and the frame rate was more stable. As the vehicle crossed the boundary line it was counted, but the program became overly sensitive and began to register and count background objects.
With more debugging the oversensitivity was fixed and transmission of the count commenced. When attempting to transmit the count, the terminal showed a loop of attempts until it stopped the code with a pigpiod error of "no available handles". Per Professor Reed's advice, the GPIO was being overloaded by the looping counting code. To possibly fix the problem, it was suggested to use multiple threads and call each specific function one at a time.
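A hedged sketch of that suggestion using only the standard library, with one thread counting and another draining a queue to transmit; the hook functions are placeholders, not the project's code:

# Sketch only: keep detection and transmission on separate threads so the
# transmit calls are not hammered from inside the counting loop.
import queue
import threading

counts = queue.Queue()

def detection_loop(detect_once):
    # detect_once() returns True when a vehicle crosses the line (placeholder hook).
    while True:
        if detect_once():
            counts.put(1)

def transmit_loop(send_count):
    # send_count(n) performs one radio/GPIO transmission (placeholder hook).
    while True:
        n = counts.get()          # blocks until the detector reports something
        send_count(n)
        counts.task_done()

# threading.Thread(target=detection_loop, args=(my_detector,), daemon=True).start()
# threading.Thread(target=transmit_loop, args=(my_sender,), daemon=True).start()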



Week 7
6/20 - 6/26
Got in contact with Etin-Osa for advice on the use of TensorFlow. He was able to give references for a more compact version called TensorFlow Lite. Through research and trial and error it was found that, to properly make use of TensorFlow, a virtual environment would be needed on the Raspberry Pi. Once this was set up, TensorFlow and the object detection code worked, but the registered frame rate was low.
​
Following an example of vehicle detection and counting, the initial run still had a low frame rate but could still register vehicles in frame. The low frame rate could be attributed to two possibilities: the resolution requested by the code or low memory on the Pi. Further testing and debugging are needed.
Week 6
6/13 - 6/19
With Professor Reed's advice the transmission code is now successfully transmitting data between multiple Raspberry Pis. The primary Pi can now transmit and receive data to and from the secondary Pis.
​
The incoming-vehicle queue code is based on a circular queue with a priority function for pedestrian inclusion.
Below is an example of a queue system from https://www.geeksforgeeks.org/priority-queue-in-python/, used as a base of reference.
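A minimal sketch in the spirit of that reference, where a pedestrian request is served ahead of waiting vehicles (priority values and labels are illustrative):

# Sketch only: a priority queue where pedestrian requests jump ahead of vehicles.
import heapq
import itertools

PEDESTRIAN, VEHICLE = 0, 1      # lower number is served first
_counter = itertools.count()    # preserves arrival order within a priority level
_heap = []

def enqueue(kind, label):
    heapq.heappush(_heap, (kind, next(_counter), label))

def dequeue():
    return heapq.heappop(_heap)[2] if _heap else None

enqueue(VEHICLE, "north approach")
enqueue(PEDESTRIAN, "crosswalk button")
print(dequeue())   # -> "crosswalk button", served before the waiting vehicle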
​
Additions were made to the overall report.
​



Week 5
6/06 - 6/12
Continued code construction of data transmission, with email communication with Professor Reed and research into examples of queue creation. Camera work was stagnant due to unforeseen circumstances outside our control.
Week 4
5/30 - 6/05
Restructuring the transmission code to achieve results has been difficult. The initial transceivers are structured more for Arduino use; there are instances of them being used with a Raspberry Pi, but the resources are outdated by an average of 6 years. An order has been placed for another transceiver (a 433 MHz transceiver), which is more Raspberry Pi friendly and has more up-to-date resources.
Testing continued with the camera module, and the queue logic was created. Additions were made to the overall report.
Week 3
5/23 - 5/29
The decision was made to remove the intended lidar system and replace it with a camera system. This decision was backed by complications with the lidar and by the advice of Alfredo Rodriguez, an engineer. The camera system will rely on image processing and object detection. This change requires more processing power; through research and the advice of colleagues, a Raspberry Pi was chosen to satisfy this need.
Coding and testing began with the operation of the camera. The change in microcontrollers will require altering the code used for bi-directional communication.
A prototype of the potential enclosure was made in SOLIDWORKS.
​
​


Week 2
5/16 - 5/22
The previous meeting produced the idea of having a camera system replace the lidars. Concern over interference affecting the lidars has brought about extended research into the Pixy2's line- and object-tracking capabilities. OpenCV was also researched as a source for image processing.
Continued testing of the lidar was done to confirm whether outside interference would drastically affect it. Solar panel testing was also performed.
​
Continued progress on the bi-directional communication code; it has now been tested sending data across three microcontrollers. The code needs to be adjusted to transmit the data received from the system.


Week 1
5/09 - 5/15
Parts began coming in and Kyle began mild testing with the lidar and started the first chapter of the design report. Mark began building code for the microcontrollers to communicate with each other bi-directionally with the NRF24L01 transceivers.

