Testing in isolation with Symfony2 and WebTestCase

It’s extremely important to have the same state of the System Under Test for every test run. In most cases this is possible by having the same database contents for every test. I described how to achieve it in the Fully isolated tests in Symfony2 blog post about two years ago (by the way, it’s the most popular post on this blog). That was a time when PHP’s traits weren’t that popular.

In IsolatedTestTrait.php I introduced the idea of rebuilding the schema with fixtures from scratch into an SQLite file database. This is done once, at the beginning of the test suite; the database file is then copied and reused for each test in the given suite. At the end, the file is removed in the tearDownAfterClass method. This significantly improves performance, since you don’t rebuild the schema and load fixtures for every test. It is also clean and non-intrusive, because you only add use IsolatedTestTrait; to the test cases that depend on database state.
Now you can easily conduct functional or integration tests in a consistent, isolated environment.

PS. LiipFunctionalTestBundle comes with similar concepts; it may be worth a look.

The IsolatedTestTrait is available here as a Gist.

For your convenience I’m also putting the code below. Feel free to comment and propose improvements!
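The original listing did not survive in this archive, but the core idea can be sketched roughly like this (the trait name matches the post; the `createSchemaAndLoadFixtures` helper and the file paths are illustrative assumptions, not the exact gist contents):

```php
<?php
// Hypothetical sketch of the trait's idea: build the schema and load
// fixtures once per test class into an SQLite file, then restore a
// pristine copy of that file before every single test.
trait IsolatedTestTrait
{
    private static $backupDb = '/tmp/test_backup.db';
    private static $testDb   = '/tmp/test.db';

    public static function setUpBeforeClass()
    {
        // Build schema and load fixtures once (the details depend on your
        // setup), writing into the SQLite file the test environment uses.
        static::createSchemaAndLoadFixtures(self::$testDb);
        copy(self::$testDb, self::$backupDb);
    }

    protected function setUp()
    {
        // Restore the pristine database before each test.
        copy(self::$backupDb, self::$testDb);
        parent::setUp();
    }

    public static function tearDownAfterClass()
    {
        @unlink(self::$backupDb);
        @unlink(self::$testDb);
    }
}
```

In a test case that touches the database it is then enough to add `use IsolatedTestTrait;`.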

Read more

Injecting repositories to service in Symfony2

It is generally a good idea to wrap business logic into services. Often such services’ methods use Doctrine’s repositories to operate on the data storage. Injecting the whole EntityManager service is a very popular approach, but it isn’t the most elegant way I can think of. The EntityManager works only as a factory in that case, and having it around invites the use of other repositories, which might leave the given service with too many responsibilities.
A better way is to inject single repositories, using the factory-service mechanism provided by the Dependency Injection Container.

The services.xml configuration consists of the repository declaration and the aforementioned service:
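The original configuration listing was lost in this archive; a minimal sketch of such a setup, with hypothetical bundle and class names, could look like this (Symfony 2.x factory-service syntax):

```xml
<services>
    <!-- the repository, built by the entity manager acting as a factory -->
    <service id="acme.repository.user"
             class="Acme\DemoBundle\Repository\UserRepository"
             factory-service="doctrine.orm.entity_manager"
             factory-method="getRepository">
        <argument>AcmeDemoBundle:User</argument>
    </service>

    <!-- the business service receives a single repository, not the EntityManager -->
    <service id="acme.user_manager" class="Acme\DemoBundle\Service\UserManager">
        <argument type="service" id="acme.repository.user" />
    </service>
</services>
```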

Below we have an implementation of our custom entity repository. It doesn’t do anything special at the moment, but it is prepared to be extended in the future.
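A minimal version of such a repository (the names are illustrative, matching the config sketch above) could be:

```php
<?php

namespace Acme\DemoBundle\Repository;

use Doctrine\ORM\EntityRepository;

/**
 * Custom repository; empty for now, custom finders will land here later.
 */
class UserRepository extends EntityRepository
{
}
```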

Last but not least, our service, which makes use of the injected repository.
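Again with hypothetical names, the service might look like this:

```php
<?php

namespace Acme\DemoBundle\Service;

use Doctrine\ORM\EntityRepository;

class UserManager
{
    private $repository;

    // type hint against EntityRepository, not the concrete repository class
    public function __construct(EntityRepository $repository)
    {
        $this->repository = $repository;
    }

    public function deactivate($id)
    {
        $user = $this->repository->find($id);
        // ...business logic operating on $user goes here...
        return $user;
    }
}
```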

One more thing: I suggest type hinting against EntityRepository, because then the service is also ready to accept the default repository (not only your own). This also decouples the service from the repository implementation, which is a good thing in terms of the Dependency Inversion Principle.

UPDATE 2013-10-15
After receiving very useful feedback in the comments below, I have something to add. It’s better to have a repository which implements your own interface, but this interface should extend the ObjectRepository interface from the Doctrine\Common\Persistence namespace. With this approach the repository’s contract (custom methods) is specified in the interface, and the interface also denotes ObjectRepository behaviour (as it extends that interface). Moreover, the repository still extends EntityRepository, which gives it everything needed to interact with the query builder and the entity manager. One disadvantage is that the interface is coupled to Doctrine Persistence, but that’s acceptable in this case.
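Sketched with the same hypothetical names as above (both types are shown in one listing for brevity), the updated design would be:

```php
<?php

namespace Acme\DemoBundle\Repository;

use Doctrine\Common\Persistence\ObjectRepository;
use Doctrine\ORM\EntityRepository;

// The contract: custom finders plus everything ObjectRepository promises.
interface UserRepositoryInterface extends ObjectRepository
{
    public function findActive();
}

// Still an EntityRepository, so the query builder and entity manager
// remain available to the implementation.
class UserRepository extends EntityRepository implements UserRepositoryInterface
{
    public function findActive()
    {
        return $this->findBy(array('active' => true));
    }
}
```

The service then type hints against UserRepositoryInterface instead of EntityRepository.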

Read more

Android Meteoapp released as Open Source

More than a year ago I played a little with the Android Java SDK and created a proof-of-concept Meteoapp application. It fetches meteograms from new.meteo.pl, cuts them into 6 parts and displays the chosen parts for a given city. Meteoapp uses the meteo library to interact with new.meteo.pl; this library has also been open sourced.

I use this app a lot. In fact it’s useful only for Polish users, but maybe someone would like to install it or even hack on the code.

For some reason this isn’t available on the Google Play marketplace, but you can install it directly from my website: sznapka.pl/meteoapp.apk

The sources are available here (link) – have fun :-)

[Screenshots: meteoapp-screen-main, meteoapp-screen-settings, meteoapp-screen-credits]

Read more

Gender guessing based on name in PHP

Today I discovered a neat PHP extension named Gender: http://www.php.net/manual/en/book.gender.php. It determines gender based on a first name and country. This class will probably be useful at some point in your software development career :-)
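Going from memory of the extension’s manual – treat the exact API as an assumption and double-check the linked documentation – usage is along these lines:

```php
<?php

use Gender\Gender;

$gender = new Gender();

// get() returns a constant such as Gender::IS_FEMALE, Gender::IS_MALE
// or Gender::IS_UNISEX_NAME for the given first name
$result = $gender->get('Anna');

if ($result === Gender::IS_FEMALE) {
    echo "Anna is most likely a female name\n";
}
```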

Read more

Thoughts after Symfony Live Berlin 2012

Recently the XSolve team had a great opportunity to attend the Symfony Live conference in Berlin. It was our second chance to meet the Symfony community, after the successful edition in London earlier this year.

The Germans are perfectly organized and they showed it to attendees this time. They chose a very exciting venue, located in a former church, with organs above the main stage. Everything was right – the place, the sessions, the food and the party after the last talk.

The most valuable part of this conference were the talks. My favourite was “Symfony2 Form Tricks” by Bernhard Schussek, but “What’s new in Doctrine2” and “Practical REST” were also very interesting. Importantly, there wasn’t a single poor talk, which speaks for itself when it comes to session quality.

To sum things up: Sensio Labs DE (the organizers) did their job very well, and the good time we had in Berlin was well worth it.

You can also find a short summary on XSolve’s profile and view some photos on Flickr.

Read more

Is Symfony2 a MVC framework?

Is Symfony2 an MVC framework? This question is tricky, and you won’t find the answer at first glance on the symfony.com website. What’s more, the MVC pattern isn’t even mentioned in the elevator pitches in the “What is Symfony” section. Yet most PHP developers will promptly tell you that it is a Model View Controller implementation, without a doubt.

The question is: where is the Model layer in Symfony2? My answer is: there isn’t one, and that’s a good thing.

Symfony2 isn’t tightly bound to a Model layer, as its first version was. We can see many Active Record model implementations in modern web frameworks such as Ruby on Rails, Django and symfony 1. I’m not saying those solutions are bad. The main problem is that big systems, maintained by many developers and changing over time, tend to end up with a messy codebase. The common place to locate business logic was the Model layer. The result of that approach was huge model classes, randomly structured Table/Peer classes full of static methods, and a general feeling that the system was nearly impossible to maintain anymore.

The problem: complex systems

Nowadays the Internet and business need agile teams working on sophisticated and highly complicated systems. Simple problems were solved years ago; if you want to succeed with a web application, you need to solve tough problems. Such systems are impossible to produce with an old-fashioned Waterfall software development process, and something better suited to current conditions should be taken into consideration. Most of the development process can be covered by agile tools such as Scrum or Kanban.

All those cases, from an architectural point of view, can be addressed by Domain Driven Design.

The solution: Domain Driven Design

It’s an approach and a way of thinking about complex systems. It promotes well-known design patterns, good separation of concerns, agility, and a good communication layer between technical and business people, for example through a ubiquitous language.

So if you ask where you should look for the Model in Symfony2 projects, I’ll tell you: look for it around the domain. Symfony2 has plenty of capabilities that support developing a system with the DDD approach; I’ll name just a few of them.

Read more

Export colored Behat scenarios to PDF

Everyone falls in love with Behat. It’s a powerful tool for testing business requirements in PHP. Behat scenarios, written in the cross-platform Gherkin language, are readable for developers, easy for business people to understand and verify, and executable by machine, which walks through the application to prove that it works as expected.

Behat scenarios are one of the best ways to describe a system. UML use cases or tons of pages in SRS documents are fine, but hard to understand at the beginning, and even harder to maintain later. Behat eases this process and also provides the opportunity to automate requirements verification.

To write Behat scenarios you only need a text editor. I picked my favourite – Vim, which highlights the syntax of *.feature files. But business people mostly don’t use Vim, so I needed to figure out a way to present scenarios in an easy and pleasant form. There are a few steps to get it done:

  1. Merge all features in one file
  2. Open it in Vim
  3. Make hardcopy
  4. Convert hardcopy output (PostScript) to PDF
  5. Send it to customer :-)

Here’s how we do it from technical point of view:

1. Find all feature files and concatenate them into one file (assuming your .feature files are located somewhere in the src/ directory):
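The command listing did not survive in this archive; a one-liner like the following (adjust the paths to your project layout) does the merging:

```shell
# concatenate every .feature file found under src/ into a single file
find src/ -name '*.feature' -exec cat {} + > all.features
```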

2. Open Vim and configure some printing options:
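The exact settings from the original post are gone; options along these lines (the font value is just an example for one machine) cover what the remarks below refer to:

```vim
:syntax on
:colorscheme default
:set printfont=DejaVu\ Sans\ Mono:h9
:set printencoding=latin2
```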

Some remarks:

  • the font is platform dependent, so adjust it yourself
  • printencoding set to latin2 is required for Polish characters (even if the features were written in UTF-8)
  • the colorscheme is also a matter of taste; the default one looks good in PDF

3. Now we are ready to create the hardcopy, which is simply a print to a PostScript file:
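This is a single Vim command, redirecting the print output to a PostScript file (the file name is arbitrary):

```vim
:hardcopy > behat-scenarios.ps
```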

4. In a terminal, convert the *.ps file to PDF (or any other format you like):
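Assuming Ghostscript’s ps2pdf wrapper is installed and the hardcopy was written to behat-scenarios.ps, the conversion is one command:

```shell
# convert the PostScript hardcopy into a PDF next to it
ps2pdf behat-scenarios.ps behat-scenarios.pdf
```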

5. Et voilà – your PDF is ready to send to the customer; an example can be seen below.

Behat scenarios in PDF


Read more

Let’s meet at PHPCon 2012

During the upcoming edition of PHPCon 2012 in Kielce I’ll give a talk titled “Symfony2 w chmurze” (Symfony2 in the cloud). I’ll describe the advantages of cloud infrastructure for web applications, some use cases of cloud deployments, and the things developers need to keep in mind to get a Symfony2 application working properly in such an environment.

More information is available in the PHPCon agenda: http://phpcon.pl/2012/pl/agenda

See you there :-)

Read more

Deploying Symfony2 applications with Ant

When you have plenty of Symfony2 applications and need to deploy them from time to time, you are probably sick of thinking about every activity required for each deployment. People often use ad hoc build scripts, which are hard to maintain and tend to be unreadable. I wanted to automate as much as possible, and I chose Ant to help me out.

Actually, Ant is my choice for another reason as well – it can easily be used with a Continuous Integration server like Jenkins, while SSH scripts often cause problems there. With this approach, all you need is the Ant binary on the server and a build.xml config in the root folder. You can define different targets in the config and chain them using the depends attribute. So you can have one target for building the project on the production server (useful for continuous delivery) and another setup for Jenkins.
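As a rough illustration of the targets-chained-with-depends idea (the target names and commands here are made up for this sketch, not taken from a real build):

```xml
<project name="myapp" default="deploy" basedir=".">

    <!-- install dependencies first -->
    <target name="prepare">
        <exec executable="php" failonerror="true">
            <arg line="composer.phar install"/>
        </exec>
    </target>

    <!-- "deploy" pulls "prepare" in automatically via depends -->
    <target name="deploy" depends="prepare">
        <exec executable="php" failonerror="true">
            <arg line="app/console cache:clear --env=prod"/>
        </exec>
    </target>

</project>
```

Running `ant` with no arguments then executes the default target, including everything it depends on.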

Read more

Suitable solutions for high scale systems

Sometimes I wonder what the most suitable stack for handling huge traffic would be. Let’s consider an online gaming portal like http://ru.partypoker.com. They don’t publish traffic stats, but we can imagine the scalability issues they struggle with. Alexa.com states that http://ru.partypoker.com has a global traffic rank of 674, so this number speaks for itself. Which tools should you use to build a reliable and responsive system like this? For sure there must be an optimized server infrastructure: master–slave setups of database servers (like MySQL), a reverse proxy (like Varnish) and load balancers (Squid should do its job here). Of course we need to think about highly tuned web servers like Nginx with FastCGI server-side engines (PHP, Python or Ruby should be fine). Some cache mechanisms should also be considered: Memcached is one of the best-known tools for in-memory caching and, besides its great results, it’s very easy to implement and integrate with existing libraries and frameworks. Last but not least is handling static content; the only way to do it right is to use a CDN. You can set up your own CDN or use an existing one, like Akamai.

The server setup is one thing, but developing code for such a system is another. In my opinion NoSQL solutions come to the rescue. Tools like MongoDB, CouchDB or Redis were created to handle large amounts of unstructured data, and they are crafted to work under high load. I have seen a MongoDB database with 50,000,000 records working smoothly and without any problems. We ran a benchmark to fill that collection and query it, and the results impressed us. I don’t think we could achieve the same with any MySQL setup, even with first-class hacks and tweaks.

Read more